[Binary tar archive — contents not recoverable as text. Archive members (from ustar headers): var/home/core/zuul-output/, var/home/core/zuul-output/logs/, var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log).]
v!Zg2Qf$E!RD[y6xs]I9芦oa]s_mfsn;ܼr}ڪgѮƦes0y070+lЛAoRo H!gKr 5A)`U 8<,YNg%&;AAs]P>o3a͛v+!Yde~Z2ɷL1q6eܠby:\ Ы+֠/˅A'c!}W ]N=˥nwagt-aN69nx7XsrkKۤ-H}ů4iSΎܧ3Hmdcl=9Hp;֪f6a4NĚdcO2^ۏ ¦U'%%j;)ZS^'J7LSu=8O!mld)N5z{@Vkr6А3ƯÕ4 ega?C>`c̏1><ķq Y<hf!Wّvx}6>|5y&l:ߣj~ӁlQ4NScp :g(瑲DB0*QH@[}mmWWt~IPI4x&BɌcID].=RG%P<)GDY(9Q#(][y#w7SDpJ `N*,5^vWYb)&'{fRv7C[*qr7^5QX'L AQÝyJꝖ\;͟~ZN=V)5$eh $T`c)qbTTp~AHNK͇uZM37uO~Ӡzv7؀ZC m`c]N *!'$MF&k`/4, wZP)=D"E3DexD 1yT֖а' b :thJЎ(8Pg)+ keTA#e\UL]B:M4mдcjګְ#(ŎS_ID팪^Y6uE=!DJh}+!>ߕH\eE;ׁk F+[8IG--z{#M;:hHYmr+?1#st#/mI .%yIF 2O>!`IzTZ\1<θG绸elӨ=(V!PtҔ +&>&Dq Q)-q!$:^~ɮ e;Fv!\Ghf~iLHC8\*,%IXqКSki/|sGgDeVqG/aV9YڛRq ߪ&z]e.&0G_?򰒟BڳRjD*QpaFc?kd}pl(0F"2bÅ_l0elW  N[lB-5!ft#Kt5ȝqwB5Kc,Ns 3 ?bjMi,ۏwq|GG7G%eBSRٸZEw@h-_2uyJuyz#F&O[cj[oSo'ofǫd$%2먊|n#QQMWZ1rrǡ5^M N:!1E *|4`tOblؖb]PƊaKu I9ϓKC1E6xꏻLNFqFs `;:}{ӻÛ?雷N1Q:> 8? ̂KEp:P<@_-@Sv-hn'|9wG2(#77?ԜYtgr_/M9,*a4XqgIH|\@E}Mm{HjWcJK᧽\I$ oS{7EgH`[D*f5*LVڰʆ}L8(a"hAcTJH.F)c]r( LQ9AˠK=/u@vo4gIՋ>JJ=ss^Krڠ*/Ün,ejb$:-U })t.)T8m0RS-b>hY0W x{zCg@z5G@+7 `¬5X& ӈ )ȿ%w\5J^r䳂ԟ]$spBL-JXXpFL% өi Gle \F,Nrxh uT8. %1VѸd*"S&8dG(PgM$xX{='{աvKYιf+T+N2UHʘ GTY-{"JC0˒ ګYAZZ79́D.ջ9¸s8ڜ7e>>&\L>ޖĒ[K,Ţ Z3(~l:gx{*>l/2Ę tC[14 r 2RC[{F,BF$㑅H>HԔ1)0+!"=[LzCgG.'_\pr-w " r ݮV|if6+3YV%_TQ(RnLҠOzˑ)*A52XWЊ,o\ϖX<6nV 62( *GT K\5T!@_/'YRl+YW+fB%H>f %C]o=hj%E,:V*dHœEyȚhef4ǽp_sтMвeKwmz|.1z^iv͘wy4vn\P=Z6-WK\ t׫֎Zm6ϩ8=g9tШ+[_H7J@rHA 9$ 9%d@rHɼ@rȫ09@+OA 9$ @rHA 9$k{kr,sHsHA 9$ @rHA 9$nOa/} mL^%TK&_(CxdKe襲W_FjV^"ԢLfKsӤ*c=.W0R k+J %\{NQcEhD@)xG>EDYyo SQa "pqCQRre wZUX\n33.O0R9>3fb1H3;N'_/Ect1ӀhxB !3 !b* K)bYpE{f} G"ez pVxE(AkN ) w.1̶gDeVqG/aV9Yڛgx$Cg<\~vT/ vhPJ~ EkJUc ~D1 F<)nm]̛'#mLu͟k fwHb*\*GgfnWDE5_M:;[N8T?2: N:!1E *|4`tOblؖb]PƊaKu I9ϓKC1E6xꏻLNFqFs `;:}{ӻÛ?雷N1Q:> 8? ̂CaЁ@((xw@r~wм7o*hAu;;!w>%F0'o*ӨGO. 
S4Qv]/h_4rqm V?cAYb.|(r.b>-4Ppe~nVlG :RLfJ~Z/!3fG'NY־I^CgH`[D*ք3^Jҳ[{^ٰϷC8<_hn7q@QbRHQ)!d)fhv1(0EO./uԩjH98z[am\<:L񘫝#͠+nUeXet\Ҍa^`ٸ=x1y sZD)Z2li%%jCoOksУbq=jHkØ1yEt:%11r#zČ TFK"Y.U8y5"$l5@/ B6A[R.V~I2!HzzC~h}: 1zh9/k[|k'vS}7ONglĞ͇`}^Ha dd"kQ9_RqR4$zx-\bҥVpb#VGHCi^҃ݔ; kބ z!=~yQ$*8ʝ7 1qV{t)yiZ+I.rdbYP,g fHrbAˠ(n&Vէj0$7]tRS6/f4%v ,ZeqJfL gvZxc.[qlN*u5b'|^H]/M>W+1CxhmGǎ h驰%SL(1xI(Kt4>S;qr<9;KZȆ| u9ri$b-` <ɇ!?YZ60'ַ]\3َ>msn94jֶD[vKƉ{cӿR}^fe/| Kp3"DX,-,xGI4 GtgYgΰ;SZ5a|oX ZܟsePTg}9ϟ^*e7_6w 331f9/X))f1y<탩w=8vQzR:RG~IpwGӶ=.tn̜wCLy/{zHvm}y3Z?1N|Wݺ O<l-lj>1JmE X픉~ΠߍyWJ='6:zl.L;J+uB$K_J_-8:n?xv) (3)7OǼ=Nz< +))m$J$EkAֈx!"Rvn1566Fg7A?ƞ 궽%pxLŦ߫Κ\^gWrל Ի=O\{*~íҰcrJ1.Mpt((9sQ\fKu:"{*(^y%&ȴBLQq!jRZr_9r3n&tBI  R>iB$ Yf3}{Ƒ"WuW{I _Xu!);z%kDYI_uWWEnlbR/GuU2lUeœ$eUmNI(d5VEm=hh{C3EQFbdYlhX P 2{EF<{PN?RT,rڢtR1AFqj ڰ`RUZԄn8@ѠVZLq!NRy_7Xz ^*jQX&S>UYo-]EefP`8F1{y7L`M]NAҔR8!L* W鹈$# [Z#6 PimoE-[=MD|#_!٭Җ} kTrEʆ 蕗*2>ꨔ 1zlXN>v&!w?v?"|& ⑮d:@&Olb=X |ޝGifSozWm'{lݚ+*:uL!׮e2<ݗiWϾvz5!hf)01G8-dUǂR>[Ťڲ_(J 㡀8T7W86ݮ}= 'BZ##UC-L6$-fg:s*3Z;2]bVW&t uYEY# j'Ww?ipԀ+=@'T$z8HuiR\{88*XXL ZJ>OnQl&L\Uթj&R*L tZ(t"g9N;QKp8>AwYGu掙%c֮Xa狢6ÕC.>~S4n嚛39!)Gr.+$el{ =4VDAw]f5~'| t+d6j_B"k!yZ!i7067)t)l'V$DU!tIi!Ea!y~ޕ=*}#_gJ-,Sj)5,m q*zl *U"2F'T,ɪj9ĄbrfdNL,2&$o9M0cl ֓%[]Gs4]~@qq|D[rcͩk_T=]~ȳR{=|eD W:r~|OKR`RPcSE[:lӱU&F:@T=@=pEF+34W#WJ혿REk.ȇv5TZSHڇ@2piJJ$AdUhY"^Pd ޣ`mCV7tטwq| b>NOqGi`lzl)=YYlg8.ho;d"h!P':wIۭ yDqK'3Fk7Z$GR;|rdWf>Z}NasURB)b/%G#u)1E*rJ] 5Gj͍g7uן8sqR&#s־j`}T,kՕ ؊ܙ3i3EeʨUCH[?Zd1㱄EwsQo8X7wv쿩3_@җ[:Yv5` Ϝ VrP2j+JdBHcx͊7mfml ryOg]_hy0ta:=4=zM=u51*]%k5)]2ok*U f;p/H*YRdT𤋃2"UEq J !JY$`BCþX;A^yBXv2ZjU"bNN pV&:6|д4!5Yk(E&P0E9MԭZ^gV,7/[-&Ik,,Ngo^舣I-W-uDc*.]:;cupɓa*NC%W!cR"!U\n~%Nee2F޽{bOL-^]1H/Nimٻ#=jjګ*_Rjڭֳx7b}.9#otE6Dd5,iYt+;nkXCsJxD[ow& Io¿׊^KhT |:Ox 'bדr{K7 irzR1"tZq.B)%C{?w_1?|10-.z!_;#W ϼ?J]eݪC ~^^ķ}>YoWw~ǼNw'Dϲ8̗-^0'(+j'&xjwMoнϧ lrTsWj1m+x$HwUQ[+Q:^=!Jm6%lm{YI9 ''kuf'Kt:kb7!")2YnĢW(h| M sKUOgE0 | wᗖ]u>+#WG/?~/j;毋E9>b<$]HNII;޴g|>ȱNo|ޣz>DƈQ(t|>:,=- ܏-47[+ZcI!5RDf#߼]|j< }*?.o!/!ޗ]sg Z2yÖW B~2$ ]JLۯnWOG 
1bX۬ӎ{k2ҹmtWC~#WZBR/6N$ q| }i7G,?:ӆvl0dOG)]vu7V^3OG/f+/Z#7h^Z=5|Ti%:|c'LGz5_kN% (C/"Py3c.{;e@Sp`&5,W˵Y2=.;md[\^~KP+"EZ9*JZ"2YImpo?6[{F7+ԮlF>U7O+G'&jI/сnD8;-:l$i]LVmMם9:bFr"$X(CZ,)r!ƬR֪j$ ԵEWH7]pD3 L[MîDSpRfq׺YC]kp1d`9 7zS^NZ~ihCrX$V)1ʥP4LzB #RDF' \ XoQ%úEc4BĿz9_a+Ap " ٛ)o1:lg [|" .opy\\m pk~l!\&+"+"+"+"+"+"+"+"+"+"+"+zP E\`˞psz6U'\?"p "+"+"+"+"+"+"+"+"+"+"+"='ß W0Wg\Z# Zl0Xä|Gl8Gţr~x:bt2} d v)|78{:!MqBHJnܱ ^du_o()][^i>0^Φ}_.}Ll},ծD6]1nbl|cu TOOmxs(V:~ko1F)6m1hY>.mU?rz6|?A$Z$Q #r}8*!0>@ 6@|0]5sϦSH4}7}n?&,lϮm_kiնBv.<xuҿf^aȕg5)ᇰ8ڜ}8=(<6Rc5[8ǺQx\ &(.NM\&j[2>դ4zP,~]n3m&<{ڌ7\TJ:b_x|b-%mN)X\b:Bc$wKw%#ȡ}wv7n4BL(2VfಹQVVUlj2MXgR 'U_ܚf.4_yӖq!ut)G] ,d\ x{QCrY 1I =EwλRofu7}pQ0l {!١o)y} ![*} aB>#3}_߉2 U VAaq>T\ULSe֔Y?Zʬ.M4<粂 $YUdrFgu) /ȭͬFꀙ`ܜY#Bw"Sʱ_ƴD²y׆ϝ)ş!-[eď;\b~bz_y,TèN͟꼉SOkbDm9 ~Ø}iN>^~;|"jރkBڟs<;kg ? et`z>2y</^1 'sY*}+JVYu)rJ2|N[UmdW)`!̹=Yӆ6V%wގ,~rgX*|w^~ݡd%uw䢟fǯr3i|{+? 0>LƟ " /[ ,ddBѤ>ۊL9_i򦶯'.M<I ê664WH jbљraɈ"\K(:!6n˭:$! îL_R>;cC[|>4i=bShCԍYճxk_[TEP9H)sU>o C5 vJJ\Z-U0Y#"ӮO2YlyY<_-)}kO>%L.ՒW-`LN|LkYK&e)TjNha31I,p,E)O*ז9mQ"t[l=ٳ~)49hn6KƩπ_YDOo=Hr{ig|7͋Xp=pc0WmdR w Z0f:HVM}e}&'aMyZ֤*;/tbn=Pq{jHx[ l2zުA8:Z*FYbZq%7Zg6IK4V\*NH8cKAA똴KBn*)2f네4lGKMR g:|4]hxԼxe %NNW<ݥ7B[ߨk6ECzf55nrײsf-(4i/Rn~5?;~11˼QfNBD#o_IMTKs (\r}+03G-s\2pOG#O}~WgՆW1Y.ƘxZ'G˶m&Fog~WpMNKlו{+ﻌ{\xe5:m.YЂ(>N>z~~xqͩknN믺}WwZn:ooʺ,ieij>ѬZ8|4[WOՀWOj?-O6.w'7߼o_훯y zۯ0 -(ĽM Pޕ$Biey臅 t X)q$SQ5"EQ,H&ŢU,[dVVTV_DFFxq>E[:ZO,xZ–zzqߦ>~Ie<9<NOVsV$- ɣ0F143G3kk*pgAjٟy#f#ހJC=[L׼K13F^w\D$  .xSc,jFI?Y 41ig UIzh\U|cġg =ruRԂ J(DU~}ޥICT/JzFwsSgrƧOygygCvz_ħQnMdZyдW K_DpFnyNPgS6\ fj{LC7XK89NSXj&hP Sz} \Y2 }@D =/ W۫:yiXU4x}JI*8jz DȹR<0Bb ,yJE}\:XsfZ%ka1$UQ T-W}M;|2$a KaP ML9MJ9 *µ3T"(Il~@~'7p^شX v/N{#4# P`}A/V>6~TBylc QvTV:?Ke^Xb x^²SIRsd %m-FD# Lш x2e@$SBe80(ClEy: 4$H_"pĤ9Q罳o~<Z熈]?lT2!׭г>PLJ٘vџksK<$1╓r RNLJtlDU "r  STcFxIAҞ-U*ʓܢP%OlRl8ۇ.;[Zm*ާj+4%=r{sCHℋ J*& HD<## +v$q/Z{*9IGCm@v*;cA+}"7RDh +rfk'&go}B/Ev/Eg93@n얩+}rvH_z󦺳m:z Sx+]j6zIc:ͫYO#+9/}Mb'o;̼4r3?d}:O܅;y&[&n|hK<-k>ViqL|k:] 
IJqמ770\Ɐ5i٬r%?x.e8kder)nF)x}ߣsUڹZFZ\OM>ԃ:~:k֔z.hCL$]T$с>i*Dfv;2i;|\P6>x5Eμmz_]{=b@i+zD$_jHH0 EaL.s)HJ%Ho0F>O;xx]QӆA< &R-,"T'r™*13*>>4^!*nc6?FGυ=S <XC#w&@()T4BhSe/9gan]s ZWϾK Y | ?x'LCϓd z.CUUcMld@tosMϝ_b!7eV+Gpk1{a~B$M󏹀>,;~zܯnЕe_:gk9- mmw@ôU;2ά]CeOfeE ̒.f^ U!r3hf Jq&P82Gb% zVBndU@.-"WI^Z"!Ypf< 6W8A.߃dA0<`xփʼnkW߰LԼ0tWUdSLjihʂ8^F¬hNhTX'T#ԃXR+¥qsX18 <2IMyřTCɨd8&tfƜ2IS)c;NYd3%ɀE_mh_^l6<]*m3CQax[(pR 6Ɠ=R3 !:eRŴ'"5e513&9T&J>919|Bo>><xUKxv;]mOwD ¦^\r !4x=XRJ:mPc +HV6zRybR|ОV̤stzt~jPuLS-.! 1OmGА;DԨ[R$%Wf唗Z)}VHI9E !1bV*Nҍffڣ*=e`R'֠\m/ONēݮUFH.P%POE`I ZК9&7H-&R\9HxHrS0nҔeOm:-*g$LŞjȽaa a:*e޳$4tJKX Y1r:Y-W1Gi70j!yY$ŷ/wIU2gHmp 1iI!@KQ ()ou=}b4鐘 Y"%&qDs$y kU jO{OS#a8Ld/,B1c]9IdA\ף`TAqQڼ+{n-GKl1u8ގ;I葅DLW 3ČёxB .$hɔtB) Wzr>V9[_/WE;lsA{30֎ щ]0Ait0p5ۏ5"fDj>0f$tϭ9Om7GS; "$7WId d|NbkWͧw__< '|F)':26YF WhUwIP&s*FVƆT36Y!KoTlڄME&:{\m{m 5A ꝫNBekͬ0$ʜSkS$3  w(Tro #dZBQW\E]ej B* "?_=O2j5QFWkNh܉ZinT5;+>}_=Heĵ:3S6`o_e'~Vk!URYYJ2ꅦIލ2~~'˸B$"MBGt`QDaZ%D9hr~&,e/Z^"߾M_no0϶.o/fT6%Bg1ws1;1-O~t/?>X%=5BviI~]ˮYͤi U0)P1häY<2#a*X'1AT%hL<gZCLC=E9_ 4ޔD5A KH'r™*1TC5:ߣ5rUڹt+cykI}贏tmsJJMdAA2()QFUOamgߡ\;x^K} QqqF_dT!g{ޏFϯEu^zIaLq< /#~o|ԶIpMznO^_g?gXBfz}ᢺ ^:4cG+ֽ>|Ǔ6O;OՄ?u 80yOJZЫ{g}^o0_Ԕf:2rQa2zJ|:ПAJ]k"ALjE &eF2 ,Z#ՁHOyֽW*ܼ_ hjGƙkvqifGYe r34kVМ߃dA0/emK,͠:> ZOZn%>hur䄮4VG/#am?ߖ 1#4߰0S«#И- 5d!qdecsnDll|'ڿ- Ԍދ-MxsVt&zZOFssܰbg) u&:׾Y Mˋ͆ٻ8rWXdY4p$y8yI 8G#yFS45K=m`Wr7f_ ʋ;Sޞ'=FWbҹ>GtDzD_ߌ^'Sk\0D9sfn!T8 Bu)he x^K)zE7S)@*8xb_,BʺڜҠ!ZqۨmM]zo령U\5xг>QT5zhƓcP9K` 2{MR0CzT iEщf;+j8畱)J+؊0*xӳ6^} l{ԥT` ucC7=sxޠ*!N$B h] AT$_Kѱ@\EX"QD؄TQ5Mjjр8w#Hm(fLʋ*@q7*:$ pb8Nj7xj$;owZ."D$3Et!W\TRd"v6sjxж""2IZ|(_o"Zy\/wwPCDY>䕵 e:zA]F裉ZDҠǃ3֥Sю${+AmaMat/Jn!M!"LWcq}5e]6eHeBlrdQxZ(|H*=ww8  3@R3{ T-EUuT[ Ep9xŻ\e4Bʨb;xZG򴘽喝z|7dVRIFˆOJY5vk;5 =C"oz=9m #1ehL}; 5UbSUmR߀ mG)^7Nx%m &p-BlE`YIJɠr1+ W]l~C PDU MW-D%9U6;c:ejGG{@s+tBy+ʼ'OY\7Ij mlp^^vf=BL|!EmCvQ Aʝw]bցhp~\8\_*~=[lx?g;xYJH.wjf~Ng&HĚZr/vhOQ0xM{'^9ҝDZRjAm"q*zm :U"B48K:YLF| g`NL#ߗO Qnj)Ǥ ֓V%[S{6]* !nn.<;r{sRin*MC7_oN8vSfǮ:1z$XӻӹڒG2Iݗo~}v2'u.҉Ǝ&Hꉺr#BU} igdKF 5@តb'7P6jCig^Mh:"֏Scp1b:gl'Fr| 
yѬT<qoz:"uvwxY 6܏#y۶~/iqfqw"_+CY#\.6ϯOL0~p`l?i>[c7j~ xPYqd`bN!FtU9{W M-3^UQf\ר9_o>jZMB)6J>L&\19.*hԹW[P9(E+K R(z!,4R)N=ކɧ,-I&'d;Y$avڤnq'45#shmN*Z?"7ǜ43]zm/Z PtTRFʡjciq~o"8YYErʂk-s,-}rme+ق50T1R}[ކ2nRSQ/x-m! pJ޵Y1UiMOAF?FOc8flc+$j飸'P E48+-#l఩(_Zlh1;*c-mM_;8} gŎ.OJ/x=[m=XTU.(U6@'f .b>BRVInq!.CR :,ٗA @BЀցumsL NI`FgL:&Cl 6hڠiԴwa{P5 r53Q;A9͟Lo ΒvYX?  N-3uDn0.u>0wX8t"!SqT7qazK#J bٙAD6LI[o"v{ev9FHX}QTy8;]|N0>e% `5]Y_ݢs}8(KX%l8xGG\~] מe(zrn/mVC:}9` SIYUJ فsV9r Z:YبQecx}P .Rkߠr\K紦dMI5l@'׷u>w_/ۦǗ1<r=呐%vϜm&<rL+Um2aMر DaJe@{"gSr< 9j,#GWE C%192 Gȼ4dQJPc V/ƽ{c;p~씼M*albb޽'=rЦlm]w#g~a6o}ү[nzs2NlF2˸y6gJNZ.ϑ;_Jw7uy߉z.:qvcXMo%^C~>Zz]fq%[ |Lȡ'(JS5'SH;^C d9pb޳A'&OPb;Ycvj*c!ކcAO(ʠ3m$uV|^ᵸ y\vk7V5iR,]l4Xk ʠ)*ʬ[ \(UsR٦sWd].tit7m r&9ig LMXS X#%F P{~TgՊtqjHVh4(e݉Zccs*qB\jK}K |d)Yޏ Zv:kMD TS@[ce\cUm $;SBRVNAUװWgѠ6Y̡L$9vR:[=JUwLU>ˬhg^`܀Z*E5NoB*?MF-"spkC**|lgc>~vYrgV`D 5ŻQGD~;{$[kڎf`-04(0JQ>9k !L*?Lo^}(#--U%HUM>BHJh +k~nyL/P~oTƷmu%?fQ&̌ȬeUf/.u7XqK,vj%\ݫ!}[E6)_dhDݨnogOil޼寳Pƿ^,.u1n0N`Ÿ:t1[v%Ht6]3O{QZ$Mz퓐^~պ赏O1wQ\3$3frϼrK{wӶQ:L">v=Ii9gl8rcw2wjE&gx~,?{WǍ/{K,E쇜K|5ʌlG?Şdf$QY)[as~z]7_~z߀7՛W8ϑD0B>Xy<@/.~ilviҊ6]/׵-oc5ۓZr{R귣|,ыes~n8]̈+4OXlVj'dzeo^.c*O ![݈f|o@Ou݉Ky%V~tMܭ.qxSȳX 3F"cj5.YJ\o4NMY>p6C=oԷh?$Y|![;XZ(oXv5r\5/'~8K~n1 E4F61N:;,Q"hM8v@5l̜.ba~#DQ61ɱ"c,D+ @AV%)K*R$-KmP (2|2:e]`.r&̲q!e"&IB$V/xA&'bq;@d)F~@Q%'Sm ƿ!p\  YZ2?"]ۗdԔhJ*zPA#/\Kikyˬ$c:*Ee@%-j"k[̓g7 *Bab-1P $mFR(BNjG]9 ўR{PxվS贪L5L$\ kK*HI( Hm,eԱ!4F]{;9n@[+/Π'w# lbq<z icKB,%=aO!O+xz檊+\Ui \U)i4W\YaqdX`J1tsUn4W\9RtNk v=@DjTZ#=]Ei^>Nѯh:9i2ϋ`;:IZ_Mi<3]u*YZ)afJh&场ΏO)k?&џD箼5w\r,'yc]$/CuAײ;k7K^s1A ;3lbU;>ۣٻ3"pn JQVo?,Yz2(\0H!jl[.)rg8ՖUJ-Ym^՝v.绉ùJ|7)|8J꾯\@`+Aj9tsU40h] N_gtV .ƬM)+g:#!"7Gk!=:n'F)4^]5?oǬozwɧ 97zPA,{OJ Ħy_,ϼWɐ=f2}jܣ2`5 /jCb hR:ܖpH%kў&Tʛ k38oBd)ɚћ Ry4xe yybّ0U!-*4ݫ }.Fy&"P^ EhuV@K1 :+J蛖821%ms 5[:pZ%YTҠaP,^qflOg8 o nO?vljwlw;xsqj/N~MugglEvBhHb ({6IHIWl5Yl0"?.נ wӗSxy_?eK4lu\AW37'w :&8Mo2y)nS|kP"I;\.090~2SV\gftm_gy:Lp$ɹ0EqNe,iF[|HQ3',xEBO2EL6yS8"'JcL@qaj\kk$B-"<\Μmߞiѱ<b9z]1#^^%pkźZt:Z6dxE ;Ld+D*ZUG:P 5Wd V ƌA Ud01Je 
VBUR)Q9`̸:W4R.kRD,#"C`dmݨJkf3qCU =$Y6#YWgl:']@ 6IgG׎m1Tڇ9c_QoMbx6fLas9,s THч@ -J_w;&6\olCbǦ>;vlV?}[l 9D61)B.0<11LA#1$,l)Si_Ch~ $ A*Yͦ26NVd =AY;Sb."Zj_J;LNn6;kec-G="fC'5153dFEG-ۚE42KS>ID&z6g=I2ơXfQ5jElN iWB$M6`k$WjkU^ m5mԴkAk#(Ūʆ䊋y~ߛتt|!h $S F:9RgU}VKU)RgϰԙRFIhd)Xby*zZyNk]`4jlYWc=:9!dU"S6)N6yQX!l$cerNTB36z{g,k+tlN/ĮĞE[U/c>=nlr>/OW'i5K$m\G2'2gkW׈$Oŀ/7(dD鄡#F1gS%.e(e6J; (E1szT(j]JYRT HZ5z EZ^D)fԙX̔3@_AڠL Y%4z!ì֟MNŚM0*q;M}^qm.|T҆m\_b]"V@d)Fx#3@L *-!Z Y"mi0!b30(HJ[~kZ"_<0,MAIN$XWcؗdԔhJ*zPA#/\KikE$c:*E1E-`/6C@4꒹,g,01 x^l22K9 'sXnUaIyo'F78 IǪVE2 ֜H JLI( Hm,eԱ!49"}ڳ. Vu ݁{r*~oyZt3v<~rڡ9;;Ps KANGg; ` !˟cǪQOQKLf]\2ת f@9^CdT1QD>7eP(ji5 )uJKS@2JZfO*%+54gG9 tŐPWg !fo.!MOoLpv~ZW~'Zƨˆ"E@Ir'S@a B%3dj3̊2vEރyXLv9>+);+A(dJ(JUhL|# x:M$pXbd9<{`3~ZYg?Ois QLPuG.GHuf=s,홣-H  4 [+J5ـ`]Lk D,jbmRLŹSq!M7R wO~ڔ~ 얡CVfZyӝ nf/X ?6=}8Og[]s;99?~@>3evȗ'o|3kg>*<۝:˜ߩ\ ewփq-[婻nnVH۸<);'#wKFn'n.q@X..s?Vl! `jlr]HYOdaO)v) ±6ޕe/FvȾnݼ~8BcԒ_ /CɡIJcؒNWwꪁ/iv%9xG_F*a5aEA:dk)!j3Q ȵ m:!9S֣zY}eb[<܂(iY0NH6z0& Q)" 'HZ7 %W{B֑ 4:&gaQiƝV2% B' H GUm*e(\A( iV'̦V59>Ѐ 3OsړwE=/ZbJB(g!b*4RĴ :B[AjV)9ETRi+XסgVK~ ߲S(b^~Zs~ޝ\)S&4X!K;]0xGh-Om~;q*.0*͍013Yy{ 1E_ n'W_^fHb*\A.ʵzKAl0v5|qa!T1r$jR9 lE0ևOif =^ٻMVLAGMjԪbt>83Is`? G.=va̒Ά[ K xCޠ.޾~ݫ/޾D]w$80 ";ם`b ƛ M1Mz5ƕ)WP|$Sq g& J?f~}5oz4G'pZXM{ p%u=}1.K٬MyA՟S̩r׾P!|L/bs=RP@/_~mҫIYEUϽB@$48:wZx A!?p$R8UؙIц 1FcpKbFbx3SBasD/uy3ۛ$OlIlE}Ķ-DZ˫Qb1y_:LLHҁ:$\N#IVcHT $ێ$ϰ# cБw3IAbj Kn"'Jќdsk,QV8dnGІԆdy҆rf͟tX" cRĔ' iPJ`eX3&P-XKgQ_{L52j<@i@^pbr'a_1rrE%U7*r8GnưW'מu:N4,rU[_͇ˠ"houc_\s^ÅɶpVm|f'0k'ED*L;n<>gX~*kptٖհ-;ԃy ~ ?Jt(h&{ 3|lBNKbG-{1w%qZgGyxy¶=eд)h 6/3] rgE>*.k3芚e[:c~V PF0g7zEYjdXߏz*Ale*0KVnRnd =!"=AX{)) JnHb΂wDðq IVKy 8]իT$т;p(țbR|ӱQjx\Ve[|D2[lpyϳn%BV(P,e廆Z&N2Ȥ6 {v=l6LOoz0-$jcrB୦ۮ;PUVtg vj>_qNy3Xه34q$M[bέV,P5z;}bJl2m7c|ٜcyYdM}m+7w]mxm(' oCb@. 
!t+5uJ,ҁIZ5"^jkt L@D=7s`r%&1՛i773mXCa몣uSH)STäs P X1}2oeSjM3{h7iNւ* S甓qnCC1DəW=32;չ9-Ϣ"qWkⰉL+DwkQ-EJ'0\0P!UEAY&D473ˌ3LW"gWtU顨B`_zq2";>}=[ :ōjGeͅ 9a'Qx WX"ӂ"ۘ8YsY7*Ej͈ [F3 2Ef<-֎f  X1 -hQ\UR Mȩ)gGWd[1TZ)HR݅i/~hh4e(D0J|B"Ф|>&;TQG#"<0 ⌨E &REBQtp1/0-j jIZ?^c9PXs < “=s\mQA FN CxR_`[ш8\xNIci`&LEU@a ɔ 2&*U+mED-%FEcmDEAqOPjT)3 XM4&D*&-P\+͉nxX1V:tҡcv<*wm"enɋ[j@H,YBBw88zo|\`{nܝ).]I5n Z(ڜ-rm|Srd[IuHO|u %KP0vY4 <@b~Kʶ7PvO@G/ii1N!W8S\q2Z䱟BHTڞBxX,ME5p 1'D fo2u> KSki=r%<\Јsf߭BR;?_trM#u*ƨ˅/u f8I$YR%U]Nd-Sm;h{0o%&pq>͔ki׻Ev= |Za4_W% .BX~7%Yu.ddFK_H<=|xE(&tk4bcv4<׍tO/j6WZR[4E;%>7+F*)2aRg~37j^'37nl)8θx*4񄫞ė;RÍ%7'h֙>"恑|jG|e 4&C&@yyDCC)͉zyz=X%wa1|W#&Vt,3^,<z)<_!v+xZoQi4[]u(󵹩 %L:.lS^˶7UXgz;h0xv*%ڡqލNgYBx /V~٤l38G|9J$&Oo=oE5 ^L잍׹ %2}X|ɆpqRڡ&xKi.pS݄)0236pIu=?eOizj&7詔?QU48P; nwv;[sɵ^n8E NӁ@.Tpc‰J[( 0`( NiJ"tYfp{IbYdG0 qn"T":Fe1R *|Pd P c) =+m{3>.u>EyB""a,9/2i%GeR F#`,FA:. <_ qQؤ`mpC4 Ti j=G?/ΧbLÕcp]hCxdhG#i0 QyI843N>>v):TivK*͙@)Y-ka&,Cgc }{&JRMYN MQeI3*LEK2r !@ NBy* Ev vڑM>o?//%QLXi̍d/vy _ħ"ۜ@ƅoB<1 #/~qAR WDRD({~#tlڛ%XdC싞Aw8|۲`Q5ojYM.ؔo\e HB` U6\q.x]KeP_ cyBmP[9bڡdi婔!SRnBbZ!R2H>-* vi en(6& -^:ATօ$>;WB̸~9(l `&uĴx@x9ؗOk/7i9vXreXh^h©K귯 0H08TuapU9_-n2k!ddK5,)y!$̭aiL̚,tuB6O?)Dp[ܨ*KiFÖӴwGˈĈD\Թ uu9sJoAϼΗ^\^;M?\$Z<Q=w})tuӹ4ilj~,q(vuo7>ȑ&c4nlǔ\J0xmJ+.} {˕J\deetzWGͣQ׻IB0G-b{_\ߟz9 >"A[oL6K[@傶̐ xΩ #HmT=[? u 3J}7+y2m : VrFl"=\ L-~;qN 5Mh6WN>|vGf7;`\$+R&BK,kʮ#4FR=Ӝq6kcKMj%dzR(-3i#IMak82-cclQAlac3㩶7l yk wj  jG70|'* ڳ6 @&FZIVz'eƼ0,L,͠OM+bȄ "dB51EpLA683DT'ta{ؘ8aMga<xE [DZ"6rpFMmM$Y#QI'Q&\P:*Mگ8l{[v5lJ1^qf4Nڸc"klr[;mO:ƑLrŌmܑl6V6cDy8Lɻu> j]zW 볳0܎k2}^xvy-\M)jx`uF Ǭ* 6d0t>lTA&hxUL CRdޣTH76(3^JH#XGS(I5꛹x<zQ֭||Q(_2 &(1i呉@<( Z%M !Q ӿ$gogڒIDdڄ7Fj4LR3{D%S6x4i"5HO9@`Q*!#C;@zR\ K:2}ǁ`A-*§Qʅgq_MApy4F+TH/FKUf sԙgR iOhKS*x_w{q D:sHQ N!"P6K&NGRrNh.}d4&xyzm4KNѬzcɗc^CKkohܺ|-̠ԯp?oS8/.tӹrKԟ_7nMURi+-7̜{U)jrV)8j o״N\-;}L-FgUͻśE`@d%΅twx1[iova.F-MR-kIXْ-]jFl>f'{Rژ>Q,߻ ewsjM[ꬓZ]WNZ[4ӑ0`JףN1w#҅|X9ߓʺN^]t~7 s?ߞV?SEdgPRPCQ__qSs~NjM'~ч(tdдWK_\}jZ y4DF6xFa"ʈZ$P?GӑG)}W"3v?Ox4flwb25oQ4.X֧-uBY,׌Ka,a.N9$j. 
bEAT[~?Ks7-\Ywu%I3fO LN#k.Z:RXӁg/Y߀q^E^~YY)&sVYYjP8ס~m9i:VIT2̉Ae!qGX0b;bOy_dQcLw+cI%TR}6 1A5뗲`QȎ mcHXYtPt޾W V |JMu Z_uL\QJ/IpG*<׏f]!,ծ5\y4ޢ慒C!뛽]smVy\ Onc#ԜuVtۣRn}rw.z-w?Hlۚ@@s=/y>񅠣PDMa0 Ei3_xUFG)4]zٴͳB\ԟ~) ;yJ'$DEH}IiLzQK$J+ZjqG&諳 )~ݔ҃:R+=(T]"瓫m[Ӣw겓;&<fQݜvhmQۯIg7g >mgL~g|=&|\Ov2)>2K֜Xl@s%gl8p-GP 6nED֥Ϟ8f"/u0nw1st!J8LWK\(!ZHAKVCȳ{s2vz1B^qw_qbisV2-IgK"ܸɍ#K&dRSye|nCmk;QoL(+9 D~.i{$f *=t40ߚt۳UFj%% lHJNlC,1gBQBK$A6綔G,ǨMÛ?L|,1^>y}VO\q}2%KǗ/߭yއ[7{tcfK;JǗp7ۿ#_r1"~r2Xᖪbo#WmlgNL ڑi݈_v [LDK䔿 Xs B@XF&IiS\6BqXXl|ѫ}_}fK+wlBȾqb( 4}n2[@1ZLc|:4^sD:B n`ٰl_9{B/xJOP<ߡJԅ^o{.O._M BL}މF}t:XeOSӪk ,e8N_wuDS8٩Zje'aSkЧZpa}1Ru9sIJ\0`!JVXTֺ@Xcʼ4f V48)<'{ONXG/t0tCYZ_Z;3{'yN~Q9?AB,6+N& L}N0%!bfFS8?)e\{m7 @Wcgt{|[}X\s]1 cΉ VFE:GOD $BX6yX-z^4C=~V.̞6S~u{6A8+ %$1ܚX5Jj5ilfD "t[><)~`Ҝ 'l! ?c-% IMRF;m?e=5tbKOܤ=s"8N~'֋e%#0Gv`yK`=mBI}Cqֿ:x7:|NNqy{i7FuAb*%v&f[+d2LȠ1.iS)Xg|b heV[z9KOSL!LD'!M)phaRRፄm~ћn<fd(KHjE‘/)*p98%FLQwESF(ُhDJK_ qetɯe:ɾmmhTnN3;z'vd2ѽrϋ~6&T|qJjIj2/9R+-t Viir2E fTхb؇,ћV+uc/zg E7o6ss6y&|߰<37%"Xe cvFK9 cQgCw>I@>eа!!oKY׻$fW1|.V$UD/_|HtI'/N 4kyjl}ˡ9%a3U=TҌM٢k*YFb\Y`1['XîdunC< "t3'25Sxk/.fuڨ&瓛(8%nr`3+\=Nun+9>ڟoUgQX hVv?p-g'}|V lYvfJYlŦeSԀ g׋mjOӋnf \Io< cV ބ<`phzL8)TɾXR󥷷1XdtR~~ओ@( -(7}i:zO_L)HFaَOQ\e/B; 팅{B"7^\^ goş{''1bR.+`DE!ơrQb|Շb@,wce6T tD#qT<⛆{&Ħc]˵}QَrvZì㥨 QfԞShXsew7bDe4#XXE4BVfhI+֔j ǘO$)TW`<6x I:0 """FDqF'-e\#ٓd_._S~S6"X 66j8;T#/uHkì䥸ȃqf\q!.[z-Qw҇X=\զ2o,9]wzqti%ڬ>M!6b,EszݜշϜ7Wi ;j{8cs"s|kCT%rEFr.ۀL쬂jX22`IFvަZ{Az%LB )Q%*I{_趺8O?wn֛zǡ~OGGz~ڣw-6.?{WFJ#_˴M\f2`2|%$'q߯jɲ-ɲEYj,Yŧ|1^bW :;üۇOjY`n~e5C+w뇩ycuK̶qǏFE PB rt$c#;s׿:OPžg$rUG'$|)/P(g Wn=Ҧ?;(&|JmhC X?ū6BiѼ7~}[f\f&xZᐳL#i$b>Qω1:QynƜմy61⯍̓w1(Ԝs7Htm^#Q5~4/n?&iٲ@ʑX>a`0*VY> ӆ86 W0b>Ňlc{5k&dIu\)Jk6uoG_#j,6+ qӅ۩=igq{gw?)?ON(wxW`}.I`k~ߊ. 
͇ņfC37Zn|q)9e|F0c_.{cݮf Op>5Yb 3 Ҳq5 5om 9U<4"D|3ˑJC*O&!`6 n3\Ir`4epkN$g5 N;kPvJғ[f2n!Z<NZ.5FSUy}KA9#kP|V2ݥNKq\^7d=Hoŋސ|\|RFy#ӭIg%_ړa"] rI^̓{@F G!9=)7yWL=x>]mE0gl:D]o9<[Q+Ɏ9lGZ1' e*ٌRQ6|7ș;OO ~8}Q hIϖqht gkȱ Gͷ4Gw5uQt6E͞VS?Rr4OkzմCQϮnQs wcu؜\ _^#wkpvxB."&Nըj Q(]F=9`xG9=YڿVR]8C PJS:IPA4)@ C:9 yrY " SqTbԚq9jMR@ШIubl] =unM;aRY@I4fK $'$ 2 ;rk'H9tBSSBR *W XKu*E91L\Z | BmYׂH ٘d}6;pT.ThΙkMbI3)|4_cNr! g’@ srpD1DB L;#K%B@!(N;^ ltWm/-ۋ^HZ9pMQ>V-c4L͘3q /=<<"5۪f+C~O?ކrFDJ֟ ֎[χZȋ?KDkJT ިZ;*H%]}]U%4]U'X]jAT%A:T$E)\ZɉTMʌh#i-ju88QxĨv!Ă2L[jj җ)l)FΆxy,Bamj"8>Dܝv3ٺF}?]B7;i{tEg=(Vp3K<$IBhppJSD}:u>Ǚɺf'f/*FƭFxIAt* %TɓZs?/Y/G$R_Ϟ-uEE%"[SL3+:xO:1Z6ңo4G !I&j<5P<';qGEI9>Hs|-5G$0MCȡ6PNv*׬V.E.G^ XG f<$i%yS>?mܦ cSx &r-ikf '=>B2O->Pc[7ϭirt:osO[MزƖ,wny|[ z^h~?dou:sdoT{q\[v֬\ηl^/7òris{&I/y|-{[ Z}vHkOHKWCZsǺ$qm.qT@XA =+К!"vAMtMXm_qRIR:})(vl~Ûzӏg3aIPD\ KHy'v™:ȥ;Engo/3`Qr% ,jRX3 ̡9b k9*sп]p.js+*ӛYSIgj_\ތ ȧK@ 8^1nmFD(.%nqGHZS'v"^nq':pI<2e0+GpY' &їDOLw9` J1Ȝ |rTYd0O{hQ>$Y4==<"S:*eCtJKX Y1r6䳆6pl$ D-]|4Ka>CrxpILaR)j!u%Y!orz.ҤC)#d࠳~qM4>#ɓ\DSXAbPOIs=+>Mo!Ad-,cUwټII2*D'8+'yh.qj{|?wDP=n!Ǖ3 Q\'jt$Bʵ%PF܆Ei: "e0W߀=PxImHHu  S؈ 7أ{ؼB!pGX0|Fq>X.MK8ץ+P Nf ɞM8*PFeFr='~*b;W'PsFBmGnnE_eT8ϷNDt5VϺ"ɷ!?{Hg+DQÜٝvag ,еNwg~ʗ$qJ$@VDH#E_N^X [i RQMjZrJS4K)•{Z4 |EI.Z^ѱW@@~p}{]rnGwWs3d5zaz H?&ټi(/> 0tՑH}[[ʫyφ'DE0ږUVJ0GU]~TX!:w,p<9kG7qlYӐO00uՀdYouFߣz/No)} OpOEǛAb8-tژ( .D#j.a-}!]RR„H9KV EVTPuJEI"6g}q*'_0IhOIj_x޺" $ elF$3J)1Deldlg`={Y{|KwA[+/zgP|&wsw੏Ǽ-;47O>G ߻Hl{U 5::P2x"~;<]׊=u|w+Lf]\R,x {AQ)c.6N=&Jɘz9D+\@RHHցVH(k=A ױǤ3rvDM_/Y>F.܇g7NfAu+tӘ>;YyO]&]%ֺ$EF% FHP؞L-5کd*4E*4]|Hmŵ$6̪#H㜗>FOޕؠPTTk.< UʝsY@'l2W^|ilֲbWsYRt|`r/&k$t|ZRiL sR+:s M QLj@65 ZC«1.{G8P'XGrt݋]bH@2UIjl :kά)QEHm-) 9 EuAHчY)$ ۰S˸vl&wFΎn'%yFE{9p1hXkb\v3Oom}ןjKPb:˧M%ܶj/[^»w=>}n,Ձv%r+iq۳Wm{׵O4=`2,$]} n_[7ۻMot[W[6ܲYpn[=_6󅱲CWZnvwyFۻEG\"Goxjtn?묷DlXs9e]#CwmmHw5>qZ˭w%>+* me!r Tr)heQbt_m6ηєZLE#̶,LgmH 5 З\ : ?eXy٩:~1yjMN!6e&cHeeT1)F>D*(S3ΙKnwLjW'r(9l2$lhRe,1_}er EHuŃUY%걵Bl`bTI *%wбuF,}lf }V.lYߥFzT=ȁb˨@KJ&4ԋ@"y_>z=sV!B(cukJ- &AcQ JVgP+Nz;&CP؈.ZR`AUз/9l1iL^1Z3%" 
ŢAV:uFힳo=7Niě)s\z?m$QBg@Vt-CM>}5_Pr$&آz1ɑ*FHZRRt$G2;zs Qhl㓼3s/W! 9B|L.nk lԥQ&`pMQB*L7ŻƻKlEG925$Lw2CpI2]JM 8GJxor+pWa yOKx2ⱾοTX|_ea-mezB?d9_[H "!&ehhʩ{BޣJVDh0rwR,貢b'Ho23`- MXd1C27d>!kSVPpޙdڣ1YJҢ0b < -Jg}J{ӍOf=5k<+8[hkh26Qg30q^լ^][o[7+B^m!tZ`0Ljk"ˮe7uΟ?[%ۺ٦.Nv1SK͵}kYtɻT<0(mcG=[k铗!2B /tv$rE'R6%4HWzK.XuB= fv}^ieͅjFC=}fT"l|fUFk40CrAf+" d9Bz*:hc53~!~o=qI8l^lіe:9S֕ mI DI-4)7Md6*AAΨMLy'=7xL0p7 vS "D ˧M?9^*#v|k"G+=(`^ȓd\Zf$UDS H^BE)*8qbso ۻ׬tW;0=mE\xNGJP.kO]J[fvVTP W@}E,6jp:YB ZJxqާ=Wx.m6`:އn"~KR R&2wm-^2Z8$*dcbӍOoVqra`)Q6vvՏv)f}dDGK6JΣJYBFh@Dk; wB =y4@2Fb\gd0*t" 7+S*;AoҎRb̑\;K0o]Vj?7瓃>j|~,c=b6K[=%ۮ:vN4@3ɳJTR `xnH)%*Qbr"'!D$Zu C@ZWtrJ`]JjY&cezj57mbS2Ig:Vgȥ5yO.rmԶ6hH>TЫj7&\$+RL BQJŰl|*VzGf V3l]hrGh%dzR(>,3i#I 7qdj[j⬷Q~2ZO--z&W,r}7{yQ.q _Rb"ȡM #$+KAd'HFL+'f۲B&Z eeƼ0 JIF&gc`61dB\2ٚX0" ,SYGT'p=&z{ؒi[1bZDY""vq2RҲW>@b4N,/>.0 T P"׍AΈ*I+m6e}@L2dI AsცSK :jvQW]F/&ĵ;g fTFR 0Mp|'ʫ,ʟ矿* LGo6r ~՛Jo^!@̈+/cpZg,tShqW6&Tt  ̰,X(rhcdkYrRPK8(iTI& u5rnut͞A 0' #)U]y5qkB U^7Wнx1 /iNn?%^~eɤnTADciZeu㖢Yw0c[bO>s/(ǘ$]kPⅅJ k6ĤK "/i|A+bXwn-Xy<*DЦ=ǝǻU6.y|4fBmT#_ WKqtDbC.oyiBjPKfzu4њNo۬f2_=mԲSͻ]5z5^zdZ^-oC3o7 -t<. lSI,O?'[UsZsf{qj[ hչT. g3vy]̍6>&Q~ rM+y88+ssaͭ>7i[2sX~yn-9zZWa`)V8Jp jd3&.1Ȥ zR#Y;D9():RdhzT3Y+ib`e P IוS"pɑ"$#uрkpvAbDyu-L`)e(w@%a\f:2<2qŴ*Jn8YEmtF-,0t}W[Fվm!wNgn#K$x. 
s\DY2eg̐ )R]EzJ8y29AЀ Ja yVKkvL|q8zr#*rQ…O?M QJo#ҋтsKqu_>N0қzj8$-~7uǧ@{d9uđ@9b3.8`G'`2( Qf'Uƒ/S`Z,.yVxCɧҰ |_]LhtǟMµFmѾ7ӿ&VYGTJˑaq,= !?x4GL_>qZ\~q||yI޸y9b?ۣw0KAt4Кqg^_r O?{~?~?|?ۻO~~K;_hRMhgg 3x0O~|h]mho:DF|Ʉo08:#/j;$4vmHʏ_|;PLʴ7?ikrD{r/NlEAZ~_*a&U8 A\Čnr|^@eٙ&1q.x\]/!$ &,DzBHX;0*jW:ӆkZz'^#Ъ*!9#bWr`ק]j$K eŅuF ]T닃[WkM<1oE}u=j=&7K==t\Z:/!4;gi&/GOdd`4*jߠH |҄tU{"s9O e H8 _q͐')d",רD{Y7X(LE` AxƁ`=qI̘"#E -"4R%%&|7'{'Hea9iVj1ES*z0¿ٻ8rW<Kp^,v0؏r!Jbc{N2ێ)I3Uݭ"%r\$ FLqUoRSvS-Y35tQd+4 c#P'OPOUE24)(9)RrQGم[57Y  G{Q>گvG1(9J~Q=KhRq4!H*`@*E}l|촭f_+ku1 xd<w /'&W$&Zduњ];hMHqyԫ].]Wh4K0$SN&WL8M-gmK'ѰEuܠ^v}߽03b87 RzX6T]m"]ktXOR (6\nX&\]T<ˠm/y=C^۲㧽֛>}Lkӣ_=>zthC=IMU-a8eU=\/y9)t=ϧtOlf?>]9>E܅I_JW!ݗ#Wo'FDzéݹ]}m8U_OgP?nìW=n#/墎>mOS/h:=/eí6>rK\&HuF8D 6%-Sیk族;wJkSyv5! q}RG X_㌁Ѧ\Vk!l۟ Pgؓz3MB j)5t@Z$gv -ۿk YU(5{jI6{_h~$!E14R`(g5j9O5_rl@= H$uۂw`Rӛ40(018vjioM|'F7S ^a5!jR%5&HE)6SLPh]P!LS.K"N b-0eD=oVOjCoG_P]Hb&D䧔L1K-\E6W ]Wz #1Fkq\ctliNzRWWPd<70b}]#VOw{å@{ vfZ[bZ 1>:o"pr-l1C/trGnD߆@%V#*2ۡ>΢1ۘdZvc6Wtvzr(PR6)O &N)J%pf2}7lՓwi3Jҝ3JlBW6!Δj{ʻ J9g\atUh'>fi o?yOͯ??ފ}ۇ/O뱺UzMӆ OmYG{N6Jįi@;SXԑoU:~ Ws*%xL˚ KJ/Q;c7yԫlmu/n_#y5p,av__W/-2>ߖ-ϿA[ud}<=ZgG͈DaoOϽ^r!ӞZR\>[y)'|{8; %%Osyd$<&2Y1\<'yԩ8xs"qD< -&-RzqP4mbiFRsX(8Hf #] H!Uc#[MS-y H5'.;\LeLebܧw'#7B}89.]ϵ ,^TZ{_ 6YK'9w}b!fv_&1szQ4q8_GIz=(0AM n$d(Tu3UQJ,&tIlѳU=4gNB {ތgco(o?qV+K ¼Y&e1)Niq:j_U1S-QLOp_3b=p)g5eҧNNԑC%M]ݰE}X{Ye}\H,Q7Cn2:ˣ^aȮj2=!!/;N)89ghޑy3Γ )ۨ+$*rƬ&[JPsc( "lk4F4<{l_wI_UETO/-YS_xM:ʦdKwǙmAGѕӧ?䨉ۊS4^Q81B5h32-#JQ@F"'@Ÿ ! 
"B5 0Q FZmmpBSm]\dS Aѧj(C$-V ?{WFJ#_[@pn3Lf_cml#I+^,ےXm[N N"EVIBq{nhn(~Xm9ð34ۚx'PV =2&+h|Tk%PYo_C3M4y.#S] b^rVطտy"ӢcQ:"׮3..Ew<1I+tEʥ9[;p԰L%6ЦTBJhS=m<@rDT[qUOTOTO֑+G8VY[+E%]kW~\y*?ګ\MjVSW~\+?p5131-W~\+?prfT~\+?pW~jW W~\Wĺ"V~\X\+?p!iE!瓿(J_ldzNkf.V a[$¨hh _|H L-niv=f>X}Z_3`)j+Vj8 p2*`]|p,DR`j3PH.%N{I:BiyTnh -4)N$9H`9J, iDr\wp_݄*w0M l<:>zEY7hf*2<"qɔ2 V V*(a#$r1 u?ִQKs*Tʙ0 \,WHY0igLցԯ"'BҿA ^˔T6P œJx\ʠWPA KTɩ ?xSah:.-اߚ ^LVU A4GnMO,UJ Aݍ?8m9%5)r#9%0ŀ3.읲wF EsI:{3ZOұmgfIU9X]`,^ Ҟ q h Jó">b>r1}tEsE+ѽߖi .sQ>jVX ώ'{aR]jO[Ztm<}`gƇfTHҧYfZ.TgG^-f71paœOa>;c$jg+~=N×K%P%VtX ,O71 GD#XF|'mNgtN9`:Vj-N CHiXhJԌrS}ǛܬtioViN1l0>n_h9:,?|~}o9|uF,BSF`k~JF{MޚM 4E g״{S}E3;-S !6?~LB&^ORrtkNΥrD{r_!IOTIfUtRT̝T$vKr3Y#*ń,'q_r vٿ?چ?l;I>%<} ɺF!!0%J%WG6׼Sϻqfu6I,W1"j 3{}Kť`IQqc9\zˡ$ v9qOA}X-gmvv>ہL:]M+쪊_z"r R ia)hw{;6 /jO/\ܛM9gMNd%Om=:ڿBYGtH)Cj1PB<)D$e;"Ògȹl!EUyz?֫]~:TJ "HSn8Gd6byd VFH&Jձ]ұqW t&b8X Otz2h1/VkGNFV?{{ȭ`[is&=D4Ņh+#CkTY׌ @Kd\9WIK+B|fd ܡB zkkwIHɪ6srܘr+O GJ;Gmoܭ+@ǣY^.=[TZSL{+gDӜ~scc+PA(dY9>)#ydO"y Zr\rNIbR3(IX# }bqtM+ <&,l.ye۽V&>u7+UZdĠ!oeEJT)NK^= prG2w:GJTrjo5_*~J>󷾾ۃ:}doDvE=gQw=`kt15ݟuuTZ1}w̧GwTܝV} ϹMǞw(63o[ WyR`m`9e2Cֲ׾% FQw|AňuY wۘ*$Nnd1a/i@B֍7A1RvJKTR2f Ós/t.o7qvC`hYްU~%o ߳SM~P_l_ͮ`PMLViR>r䁫YpgjIkuۛdՌaipճ6(ϲVI{   Og@$ qdyReYLX&tJS2iYѳ&Άzsf6LTz+7Irۥ˻&[A3fO{hd)#YrB@*a?b '2%gN rtZ%)c\%({PAX9jTשϫΎTGMGV9 TҠ ʀulB2ZU "@ Rueة͉$z`hVNqM!r4 53X`@J` ׃q!Ul%5p~(G]Qަ}{VEC(mPFmLkTĥ`98P!i"G08]Ўf}[ OS01*vE irgHP&O6 F I`ٳI\ĵ$H6]OLRK5A7 υN|l:4r"-v=w3J;Я_ '&}Dlb,|2[5H &&pZgKpm\Ux~%Ώc_j>asF:ni~[}u~n>xEULFi`7ZT^4Gʶ{ۚ9ITl땷m>H&a͉cMt9\()S%_QD)IxeL9h"k%E#ra~L7gWa*jht;dHm}+gOk,+w%5;|5CyPmsØ1|wмZo)ɽl:P iwxt6-i N- ՞J< J}F"}>$Q|aEJ]a"AV{ågkR_} ="B$(o%ӽ)\:(N'b'}?$b?BO2 6m,dBQg]8r= wmmI2;/u<`1`v 2eVuDI-y|N рtmCyhm~ETm^dX^\0?Z^wq@hQڧ[QW~yҗrT5:m8pAUIWL2"s(Ey(c,"< "T]޾bV0|;`B² †` $?Pd| ;Ķ:L."AJ.8H/r(gWs;0LC `Q~16(vrWoDb_q{ nx53նGڟxBNYyK'-b.PJR0aQ=i%Ѐ޳EB x4@Z!,H,T"/z)xR3HB3q^֊i<8zw)d3#NjzH@C*F%|RZP䒲\VC՟oFX/R56)hA*M;֞8.B6K-@k[rho-* DK)dXL"20Tf0~n1Cm"p5d|ە74Mt|<{d m@ҥՍ>.heaj׳Ņ,Ңmޚ ڨznפ͌wq7ɢ%m%a-;m B_/\r)-ݛ&`Ag2q%褂Ji53[ < 
Omٹ|}皍6P|/$X%L"m8y0Һ̘R2˥ggƲmUfqbq)~e_l!/zzM/9J5j~ϐvLiН'}F3?M3{OB$#?毣i~hQ߇<fP#R*pR\<;cns (]o4i6hy|xȿ- [d/LJϿFfuÛ}7 ^2sGBWdB:gHuZ);<3:;ځ_L?Ť$jR͆/J+ųvN:煫ITnRn[Es2\jJu/fJiW \I@\Qꥢ߳?w?B9UN?ޝ4 ?U3ܗ:b{'p1fTN:NwZB$ tL*+YVgOJ9i/!Tؚ\_ (xcF!s ~aU\|1_%vRZ;@ 7yF>?kmxpNRw![HYAR>QOՁ'k_AT(d=lBDN&yQlF&4)*SjIh Y \EEbk(J+-aPgiY^x|G~zwks8Dbrv.~kvt^iBr )d|L.!  yki"[d E1ʨmdl DΈ3 Tڄ3&ը@f,CD\HeV7Dj̺ TK4e ‹߯⾜-W)Wr"o'~Ɂiv :ɛd} N8$6'8RnI695i \)(bXV )%Kzn R7va !{%y6]6+j vBb^ELfC*ls0N hbb&?eU`#1 =@JacܣVnyO9a/0>vx-}=+.:6{ί)Nd:d܍uw GH;ՆgK^vxҬ)*oҹ5&@f's!TVֶBڤI:A{-2jkJJL%JlA2XBL*l,Y(orҠA526vdlUU7oBh0`w;u|R}GNgq{/dru2~'Flr 1Q'PfJLSHKJ. v5IH2HdbQ< I ADVHLaNXWX֬.1m}2J;Lӊ-L;ڲ1j{G양r,jSZRZbs(Qi#"2ɹ2Bxh23CPEĞ&eVS2F>BIDv16v@~v:;yn80>8{ !X- Ѿ~4݋v݇= 0vvS o􆆒i2>1C@uysreZ/g31:>4:BWތv4T|aZ2yݤqObc}#]lkMXj,|KN)p,J/\(G.Qm-}a"fBS/tH΀*0XUۀHY3/8:EitfKTJӗf'Ыb0lٳ!dv[-p+= U+i0㣃d0 ƺQt 5hJ+fPShSD"KrPgQ{Z6*JEJ"z*zbX&jS&I bBI !gET! k ZLZ< sgߗ3 7X%[nQ?]C_ތ~ǣ7M7Z)gjvKbv;r75WvfG4Gztu3+?ھQ8O|m#Yf5WeGnL PȝCg4"`Yӝ1NnoEDG!r:|3q)8~\4IX5(O&|SN#3ߥOG}mʛ7uCVl'ntGSӂ(vħYsFL ϴz4n1V?oWajU4g2}-]$MTe;I~׍$II#F_aV=mӃx,U`:9]N|S7<7|u{V`ƳQg!/mL>=8'<:( ſzӕ zS:ף8>gӻ~i?=88Α5"h!,?$?e U9-Lkj,]Ƥe)U͜|IZH7=֫}y: jMI@ұ ʲbf$ d)1Dtltl lmp3{}1?9(P7>O>6tڜGǾwa$۹ Meb갫҇4>|U@2!PRBŦ74(HE{HkFZ1r.jniJ!u\t!e_TL(#dT.Z}!fl:g30MGr‡|r !v?nAx{?m[L|qfq=5?1iVYqQQ@D ?l٫dgV] 2X|H-Z$KfU!ge;O%06(.+Kk5<+U5%d3q[q tz^9g_Cһqn)9|&L37 QLPsCG|DI΃縣zb`vUPhK`Q$dJ9(69'R`k <ΆmSֵޖEXۼb ڤ}}8r<}V8fzq ou;rK!l?Mw+'wc u?ں{=C4wf!{nozl|狚|:[>;s9MF{:^.ѢJ@W[s9eM_-鮷yJݕӏ',ܶ6䛣1Z'9drR]|Q]\ph~6y#gυb7۴)+jϳ$p~[IkՎ'wɐJ<)^X+#^z8 ѻS|f=S+P]?g{!4tN"?$͒wmXXq8hœz,jt4'M2sQa77.Jb\tޠy(Rh:8*"P4QtQ\X w` цhuJ.9ɉ<̶8$[P(/YD=CEްFE9&ASqR Ar 9ڠ1W"n&QjC(ܧDc>_ev[X2ux\SVP|M)6!C (ubR$HAS"Y )6܎Ql^zv\ED D6 & P MPE_kS_}eGQ"$ELm+/xQc9KPj֒P&F(ڡpla 'XϚclnsFrdW[ֵL%CʑBc@Xj9A(iha7ϼ1gY AΈ ]?* |,DfF蛅Ԋ'4IOxtG'I{[ + @^PM UUl!X|`z0M ^<6/q=7:s-s*rEIc K_W|[W?lT2:e%.@] !"݀^Tۍu6C:0&1yz TB$ 0eR*JIH!&{z ֟I׾;“ ?֋kĬϻk%Q N=CBĵQ&}0\,.µQU\UJmWȵ{ӋckIFeRV]R&tr*WZeF.粘n'`69hf2vXm*b*qrҘ H%cTAZG. 
`/^8!^4Yy>4 ,8#~~=֦y9AVaȈ+&烐M`h^*RD(e;u/K`^apJbcݫ"4PEP2)o4Y-> +=l[OZ,Ĕ"L5dʩhwA 5f*Nh&aRx;,uI/Ǐܢ~ݭW-_/Qe2_\䢫gOlxBNqja鐇J H>?X&uOp##b1l9^)hA[rP11Kp%ek3Xs {H7k:Xz(jٍ.m.d#ˆW^4j^Z"@b|AX6h )%*ID6RXFz<̌xGOn„7[.Mu^3-{Wi}v,ZWVxu;r U,XE%Ӛ ڨEkWq.zqgm(Za!e;m8Rh|X|QZ>x.A'RLY ,c(14 (Xd-: fk'{Mxy5xgA:6i](MSB'2 $ H\*/?)rq1zu+f{,ؘYl½U\T~{g~LO-鞁v%]ӂ "Fg'=/>!t2tDjsu&@{^/hd$N|f.H ܵȽ"7@V{Arr`dkR=2Q;iѣ P=ѣJq`xVkk =sU{(JfUR\Bs4< F6; ,r{F]A=Qjp$; 5~7|4l2|΋NNؑP|kFo?~(q>vI W^DR|1ϡu?# e4oWX_o.Y*!y?Iۼ}^[8 xӒؙ/߿߈iGe8M`PF)Z(JIr@eQ*0S1WUZ,b0W/\ ޽IAn&g6Wj)g1WV󘫇I~+ssإVdX`pjw(J`{sRz tC]v"CS?;ےŏ//eN_Muo/psG[ ѕ8aP?߯FXS}HF7JmyX]zE^>-mo~^URl!5ohSQW ttsMI눭Vs˷Ǽc}=m%U`GJX\P@IV}%UJFPbIm~qR7߭[W} ٔ;,Rv>GjX ˒4`$ uI%z0˲L5ӏSk<:% $Pɓt̒lAPdmD!cP 6g-1_sK0.+)dH0 uض)f{=LPlo| K :)K|<55?|OԴ5Gw$;⪣MN62ir1$I٠R'vR$HAS"Y )6=,H=;."Q(I*ֱSֱxV0F1ɦzEʂC=thY@ವ$v(\q9[gB(I5ֳflg^lcVv'ХT cزg*L(rq O5(%;mܠ-Ubb/ v? !Rc2m#GŘOdUe^p XINs~ŖXNZeʖQ-Sd),^*[x VX(%E'j5Zq11NsfґwMZQJHF9/F.(JV`T CI8Ii55ddSx7ku$&DD~rI ""Z J@k [rWE@EOȡXQp ]@[\tBkfLH m*8F E68G;#c]^ߞw4C5=.|U;qorW^(Kz"Q 6I+HIMgOg^@%m"/Nvl+Сws>HB44ې!U m?Q8t)F:dtDO#z!|'3/束sɳ'WJHRyY ST!$g0%"nrDR|NSs5^шeZ!(ҺN^9{BTc!mɗXmOf h=ۓH}^<9DǟUЛZpڰ^ZS!e&5r6Ghqi3H3hYh qJf3$]t 5Y@`A8Rth Aj(7]TkѯA* HR̦TH򨲷MBGa/5^l֝wcw=vOJ'܉&֭cr vsf _@)0HtL M."T4*']2s]&Hœ"b]f W!7z8FWէ93~cӋzbmv/2OĿg˔oxBNYImN$KFOZ\d) Ea‚ <t{Hȧy63-ާzgD<ԃ8S6xǬG'GUH;'y*]qR|/] 9@L"`+*>9%e5.g&jFo_GpMj1ġ]z)j]?R jm4~ȋ7O2x+Gd#_4|/=JRL'vU9使JΰOqJCZ%5m_^_IK}~ЍPh$}')?~rQ ,M 8+dz.N?~19?\Y:-9#<ߖZ `r~oHC/ vylHzYQC^}bv7 Ϥ{r`[u:oZukg ciϬ0/$&ϖRF~ކa^wZhjsˡǨU-8^Jver1?rN*M[z}wnv5q6gV$ᑻ.6ZmkZ6B_;nm~λcL켿E-'vcx]gNǥC.VOnˋϲѸ34A0hDP: k:LqJJAB4Ȫ99̴ fc%&KKQo/eѾQxN[_I$2ѦhFY@PrZ Jr" .r>kzcNJ m88ܐc,)П"*Q(}CWM7]N|?q׾cúݸ~<[h-ozWgXE-w7|/8:[T|Ya;Vߝ5Nofw]M?1.Vt?: (yzR(0o%CʪCs*O)4* -t=dz dEvU=Qa+AJe"@5To&||Or~uI9txݡttL#:jA:ϨET4R~WQb"'D% J[? 
ZzZ6-=ffd\aC'&Wx="7TpjB4tlzɮ ~ݱK# lp}q;lHWHަ++FM>]ۙ8-]U Bns α8uU.ª,BF<(SXUl+{UUr+,1 Vep5X1(.vZZb!W.٥dKuMT6դRTd 'E@`C>ɦecYu$3s3Z[`kf˜=:,PuǨ+vEKZ/ r7UUn(2 ڄ\+y nT;䵽2 [.:b(]gϋ{XֵwujY.BqkZ{V v;Hxc۶EZ\Ǣ; .i]Xp 됳LɐG>12y\.Vֆ(WThuXՎs⡤Prkjkrui~<,P͵NX TDA?r]k=DnPqSK.>*}c jJKd ,PeJSpm^:vʠJ(X0{ Sƣ*]PpVr1h-Ա6 (1&frgيFJ2:k͂ZytTl׮kq㸲ۺG~ `NkK"#W[S#Y &C[,jZyc`"OU::.5y;aJ$|)/VQqр)k|FnxJ v_TTlPtA[wZ 4@shvXqLڏJ$[PyU벫ԕ8Yɠ-],p ]ɇb"uT6 Ņ5 -c0Wg\/ *a Xa=Ҹ1 ksty̡  "=wMBA[EtJ9)ۣ[lmB7BJJJ#L2b!]AАA6*K, |Der HM${4U2T Xp6L:@@=-V(!Ȯ܁pYV ++5 e2a=omN VCH(PED&TD{#]ozg-JC(ʨ[sV z,,1u n!6K17)_JJ ufM0*Qkq(9n둙t쥃~.V!joSCAV(J892rAj ތڳFxwD"=dPIWm (ĩ(Hv\F*^eFTݣ.RVKݹ`3/z $$dA"pPBi5&Y,  ^ B9TECk>84鰐Aw nv׋q)b֛䤡bLbb󢐴t8!bEfC|]g⍮oWvѲfkwNՂYPw z$M"X 3 o b9PT8x0O~t*&{H\C RbtLECNZˌ %tF\8gP4I"i Yf &deZCP1hx L}辬$kIu u<o oxC`QQ,TGW>/AXżۆjbPb$Sa{}~zk|g]J }*4ʘ} ]{ #BK|u)}nkQ /ѡB-1`8BYG]I@(ʠv?ᡔ5K q[O κ$!H;V@@]BZx _3!D[vThňe{}65 PТ td!.h]`mBgTHQ-fWbd@BUq@ơ"2ΪU%? ʰE(3A68F)PNX)j?VHGYtg#MGhTf%Ei*5^ZVE* XHhF-w$a:J` ئQ}AgAK Cj)M.1ڹ^y6O]VW],HnLZ.ڎsU&ɒ;j0V$LC'Kac+i0C˿GB("bjЭ(֚B/ yHY}hPAI;yBϯo9̈}R@k80)QC^"zHmk"* F m}(-hGr%5 2TA2 5@J'2(;ZT}f=*-+P!>,D4cA$\s 9,RDyŰ aLW(ƈMnƢb${ag=uc[sOҕȪTqQc@sjo6i]0s0]FzPIQ %_T4/Q4T Z،P-EC9yygA*DEK.j-*;k*Y[Tڠ@aQ'X©rltEiaӠ/B\] M F:`yGrStE8 8%mCɵ+F(ЭtF< \T"m֍\4*fX. 
b!C1 Ptق/"S4nR@Ւ0[)!Gr6B!(L ΃rt ?0 ՠn~/n^.כ0*:i JU+HkN{{*ߠ8}{aLd:ZbsBe~vEEUoi}{]ߴr{zh_7Yכ/onpUT?mNGCCs=㝐>{ N%Ϸ';[ƌP|>{(>ٲԐz^s8ϳM\˂^tWao.v裏rrƑ^׫˧Oˈ)?(믥Xё^.gX/vw;жH޴U_C[ٰU};M7Gmy$B&xi`^:ClBv+mc8֜ Ij6\zc<pņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbg5\9PiT)1\֟ (#pEpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlb>*Ҳ WȢ.NpOpU W{pu+pņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlb4\ᅞ1uy-a }+B peT Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\jT; ]W?Ξ= [»ݥf^(/|a`{e\&ZN?azH ;1"JONWϑA ѕe)5"ӡ+Bkͩ ZvBtvB ) mNW@g:CrV&DWl ]9rȩЪN2]]yrMiq:t+Bkũ|+s^߻t mݏ{x+bFo.m7-Z_53a/z7_nW+zsvRuτTR obPm+oWb*-j^2̽%]E_`^ @6W% r ~~9Bݵm}sUV]ME^6'i3:v`6鋱~kt+W?[FZmWojfvvnA S"~ó?}&^u)5MV̓9|4܍!|פ;SDͱ3*F^0;U΢"3 6d{7z]ENNp ]ZwtE(cЕȡ7"Q8x \)\:y&ǡt'AWS^e['CWƩ (LWgHW* ]U+uPzs+<wQ60]5+^֭2Bc:CF&DW8ЕW7":PZtutDfJkWl ]\BW6R;PZaΑ_81w줙IC7ҍ;v!~^oQJ?-U7?Q'iBYz=JtyDNu50|d 4{isgRt]L穯E4aO~=4!whQJ*INKQjo䄤pK܆m,)JNE ZgO] ȻN8R~ B-߂p`gr? nLWGntJwbEte>uQ ѕT(Sd ptS+BkP*ΐFOI]r2tEpd'OW23+_tRN$O羃Ps+MfBteLS+Uœ:]JkWHWt{`)d6R tut *8~=cUv*tEh %]yLP+z:tEpۂpZN%/%]žX9J]ZucP{s؂pDu4{[ Ow )> ^󅎈S#b{i?爀J)UspLSZiyCY^wao_ @\y{]InxW?|݁?1r%U^ПarۯG7%&_oНOwA߇oQ4NC;oi9˻?x꟭pCޟ_ʋVo8 927N~=vgxg 7z=K=m9c1f/frOaw79˛w'0{鎶>a~/gnn>n8X۟W/ -'s XZWO{e.DICoER]*O:gt@)AZ\[uQzSɨn2T)"3y4~_*]MW/.}oȾG⮫)˫ CC > W:7M xKwF+D_5kr{zwc1W,0SV&VO48J:| S{fr?nq5qI&Q9kUVH4ZߦѩϭZ9q~&x8xmXr~;2 …TBݩ6^7ccQOv EC18"7,臛V/g[[|N_,?48QmF~;<~1Prڇ+bKu7dY-ە׶nj9 ?pN0FySH'JȍN1h/Doͱ/L?.5o7Zͷr}2Eѩ7[$Ov3]%y m]8ZN?+ ߃oiu,YUώwVtTFXt)D'KD|gMn7y>/_ )߳|p{HLz/Yvξ7vܝ{sn ٸd*$^)L\B"jc) s($'D [nB'( T cct5bXc8.CV^;+Hw-P \4_zg&[&XєWvgӂM$%7nNmn}Su.4더* kc6BU3r S Z•̝rBp2fT?'+HoVY-rPbL"P٦ɱ7pHM4(s(ЩЛMφXP=,mCd+kcظRj1 4D2xY1^_uB6dOTНE'꺗 6Jl \eO[k⬬Ϗ6~mmsq[~Na]^ԻCnCw:> 1w{fp̽c`zw]92KWs~Q{PKb ]s t,ހB @>%dOs%ГsO~@j5 AI,hQ/jޮ{-Eed[z(%#_!X$T)ST`#,) ^ E@=yJA: a}nprr ~%=^&ٗ^t絷%ӝְ^99Б*gCG7^GO_Qh.('$ӑѩsO%"hG d2yBlN m1.B2!Fj)zGm5.3[_bV!FhäSUBFd!e38MYs@@Ntgbj6&Kذ1Ϋw/iïo }z JBYf@I)!=\ANEywBhO$ wKX2K(˹Uzk_fi2,3!6ԅK9hTrJ],vĘ2)*<{4i#]{-Ygy V82ܖj݅\TQ;J:WDd54NT@1֤#C9` Hj- % $S$"SV%\D ;S{. 
m=ύqm^hQ4Gۏ]/gW/l{6+U ۈx2_|w&/tdI ;OtbmP!$aT=TTUbEb.HF 1(`s(V{1Lm8V ŬUyAVM$w^ۧٛ8CU }H ! Y1Z?0l/3eX_˛c1V/(7lbr6qRr6iWlR0A\W?ī:yu:y?$CalO!.:!eryM{&p$ݡdC4@N%Jlm>J)؁sIYB 6ɁCEXQWKI8:@bKa`s"`m]sLeiS"(ݓ{cuo#O\ΐ%h W!/qV~r",Z>'iPU6NW}<]^LǻLI3ݎ2&Kr yt~5w>$A*?kfAKEe!mVJ[#rEH9P ?k]Y@jEY4q5B<-v9e11&B@P,3EVB5@Vܐ&jK!j%l糳,{Wv**]ͥC+}Ŧ鄷(XFǾV[l`{߈F -Öʧ?V(:RI 1E~lES&i CdVXt6ŋᢲE1(9+BOg{؛8%J XǾl`x;WM ȢȰ}QqISIQUG1eE BLյm&C+ܸ5`K;SZِ0imIln5Xfҗ8؏|ZUyVCu6Jس]]| (GqU{!> xEL>G[.>]<8{g0a[/r#~_pdV=G }kGF?#1hjTf7bQv֘>yd/a(&e۫ -[a]!Z2#QF0IS>PMbbk:zyR/ɳfZȏ2K΢ب0SeaA(\jpK' Qx!A.LX]>8oquMWɕY{WD<45[_]l bxшD- A <~>j2)=.C]QQe:"|B)![@.TyLЂecIx4f40_et_5fF ~\SoxTⶋF٘Ih6"H8+ X4] ?3 @|l)XnI# B )|*nǢF|kn5C)hlENF/V6~iNgͫw5ǟxNWU'mqkd? ˛PFk?Njrq'&7V`Dg~8?te˵yU{A@s( GwƀGi>/tvy4_ŕ9OoKq~.򮍠2Y4܏ǴUu%顟ީrtFdy}uv qC.A(?*i⑋)<ڬQQ2{w ڳixգ͵&.]TZfym`vK<<2ԳUߚxݷ[Αd:c3oem:LԋwQl-jFlF^fw\&ӥ}lfmNj{pZCju׳XkiuJGZ(oYn8/i5q G0M:M]lѮW}>voa̎f]*1?˛ u<_ Z_\xYs %G|[| b<-RsT&$|cGvKZ'@q_ˏnđyp?8G"@Df9{%hV9H)XKRZ9 ŝ W&[jϯp8 YFu+PcΣM-_r`*6fA^nrFaM5yY|8ξġ#Sƾ+;xUC=a;vtP.>Oε[xy;Q] _)` (C ώ\(Ug!9>s Pް֕6Ȅ^A_KIEVq9&vG# ,br)!(aYҝ$0ġ Є5 z:bz6Η&NЖ(iQ@9*T-I ] ܢP5D /ZA T<nCASGS:H^ڷhtVSt kA_]h"nu=ʙ'@*JE!ԔO4[9YP,BfRpERWG1r1ZIҬ }tT,0}mAN[Y ʠccZ+ʣwng|%wuAk|*к.ƂvS9N:t?/&];$; i/sUCſrx(f5;"X=.ƲpCg`bHI(`2ʵ{hɰsmob%U>]-(9Y]R0VtA ={Lz瑨sFIweHǤd 󲽳`veO]nco0:lK$L+ >,xLj`0~iG'o î[5y^V!г~ r׳6͙ѩ%yZAۨY,a3D;DV%F.AV-Kyh7W@rfUFt%&TYٓnm6'Ǎ)\48Xۙ[ճ>O&y5ܖfM1ٮ,YJaS 0Ξh>pK -))Jحbzbz(yG,۽GWd,XUdeF͕248T cUdpt,1&D9E.ޔJB&i/!&Uql0JnP]1+YX>=otݜ»w^ %ȭЦ5w՝+vK_`ߐI_ZWۻOtTg7~lRˊZ}۝7=ܾjsOceTx6qjY:ŨDl,|FT|x Ե6OJ Pj9-FSbf+P"UyR`U`9eRCrH+%mxk4!F#@DdBd&J8KP|/l7A[c)D픖:{mSI  aD*6<K pIC۷ͭyی.!X+t&m:;u:ޤ&;>/wTŽ^#dWBMLO)9&-y*cLpNY+9Dy$"&L?r%[ ʳ0Ir#PqJIrylQ(Bf^p,;0FĞ[Y<, A:fChYZu.ԡb6ne]y V&#zAlI!̥bR$"'DM֥߱ށOeJ'mB$ GU2;J#̂ YˁFjC 4I= -GPPJ UePFcXbR0,EUL`ID˼7"Ё!RqBJڞ$ 7NǴ24qh fY6sMZRDP!l%5p+WӮ_k[P- "h * =dZ $.U6YQ Nd0H/@! 
ўx,EYrt`ҁ}j@M_4zN+1Zhik}yoZ8:<7]K۷}wþRxaw<k[)Joj@2$?D\%G"bVѼɑ \J&G*(WCrIčFfh4Re%͑*~WmIZx̕ +CX mgҳ9^“1}?/ZfzyWʹ:w84ࠥn]՛|dB[;G5n#!:&"h_IbU΄ S"bdFw_:adoo.?bZU> I,@RmPdB.AK r=CEl9j9xKr7(wSJ \>no̝M}̼vmM]HQmVˉ_kf4~W> : ^@WyE^ʙqH>gh',s<0QeI|Y3K22 ̔<ӊH,+\&VVFf!FiRfS.@VNyn O1 G:GtJ29JIa(%9Gɽr1B?\ǁz6RW-6GCQRԶew8/;Ȗ6Hph}RYϮq@6q\C rDJUoyaJhb hMaPC洶NϣvLc.|&W7yM 0yU^u3(7Eӓ4*%Ri>E5 iVXRlEQTW@n<0#q߾ Dl$qOF7toߩ?y;;^׻`_o\nKd;uTf /'5.6ZH!69mM6 wSzΨ)Xbԗ{릅,/e׫/Mӊ0rkz@dޯìML-m8.@MQ%6m ca-΂sr$t,tX.|&9MG7AQc)N_v i׍QOnz?!gNU5_8^ll.m7#c] }-W=H*GԾgK0A;Uݡ3jwަr6Lgۤ+\[Ux]wuDs/əol?x Hü ;?%>([5ApgL2&y%gJXHY&cL+ib!j"`3 n1Jf2 Bq0Q Ƒ9a%>3!C:Ṳq`xr=iߢ^ypxia(7կI-)R7uJ.!AVw7t(HbF*X+J(SÛϑ%g8Mnon22⼙>ÿTD|Yb GRHRJIQh֍Se$i?vAq?mSe99V"I]blWjR0N#Ed)jVg(brdJ1CAΠt(!b 9.84 3) )>_U)|珻 6k j1 !9 Π0 .]*p'u8- 0*ǿJen lWA|G)!"C3,=+, ]7tEh7]R@WgHWJM67tUZ ]Z;_B\:R0c{DWT|,pMo ]@WgHW9h#"7tU**hm}WR13ҕA{EW1 f/tEhm}Wrb++@r:s5J7V臕t6›STp`~OfM#--wxK. 9>|@<LJzG?n\Bq/hfr0O8\}“:={ 揯cVTSB 7q`FU'MBכ:Y&=BOJ->٣~N/޵0,q@.V3Wҡ\H;cRùdЧk\9F.pAE.h*\PZ9ŸF6;.`285^8u~pOuZskPbǢNte:t9>=ؚU/tEhO骠=ҕqUleo3}d骠Tvut*1HUkE_NWe|W]}]IҫQ1 \ };*(c,J ls쏫 }VUAicJ D'WƤ ]7`Au"ΐ }Ү  b1 Z]Rΐ\.=AP#J% :U 1nF銁[{CMGAk:O%ځϑ%Q/hm?b>Q5*/eRau߽g`j3_oio/esdGGLyz=FW#ROhxBs[`P-pGh%|@\A);ϣ^0)NOWVGr&]Wd? %Kd+;աK%YDGtUҚI*p >U@~(tut%`OUCWf{]UA]#]"]E􈮠oWUAtE(5W]!]I,.wUZ:]J@WgHWJGtE/tU骠z3+5]im ZŻNWs+cV}ҮQ]7+BkY絫R s+[R8x)Pad_G1nhk\z%SS Cmt!},в"k1.M=ps\$XxZ&WOuDg.SiWvK..Upء a<-MWCjv\ n:C\٠ F)0 \Yp5Z];ʵ-p$HgrWlEՐ)ڸz\A%n:C\Itlfv5|oErew5Ժ_*usW+~ E:cfP~*=o:C\hg•jun\ A֎+$#Wq"\ Րvv2 ۵ܮrC8ХxlC*l{}w2q2-[1sm8'4D2`12ry nue;Tyгynl .{⋊ ՞UT,dp\1CO7^&,O!ğPz/oPtZ˩w^\(WqWCq5Tlo WO+ʧ(v`!? \j ~Jgbq%LOwpOqoXr̂v%W%0;r\ܧM߄6 *YbqYs0fbpȵӸVVgpW*d"O+'t,\ <0v\ q3x ։_@ɥK]D4Zneo Rc~1pݓl,Lލuoa,buow-biIx?`=0-!oCڭ0T=>+%?S'Rt jVWCnYpJv\ k WO`L! 
Ðv\ WCe W+$/״: ұ:ҥq 1rpuԂvst{k׸}J]cTD'(L'(b㶟ԔI={x.(w1U{ }@ﻣFd~rP+GC1_;d7л_%]]]>Oo Ƽoo_q79z/~k!lň_^۝+M_>́m7?70L?zOw7g/1S݃mkFj|;`|&޿c/^;aa>*@Q;r`3pE <N5L  57pOw?xm(HEm?p]^,dḏ%+ \j3džkQfOŻC"_(W!8ww,~'pw7?_Cr~~x~+۞EMɱKQ2\F|v ףjhsB1+ )fAn:M9W[*8t5է_b(lrHG ξ*ں16Rwɦn2i&'9 :T+1SZC`B[%ε`0"4hFډ{s^բk3B,K\~XCԻsf^3DF$ 'F1=FI#SmIa2c7f8:!)RL ZՏ׾AW".ۇ?[=d昭1+) וu' H/&RQZ:SjH) 5E/d5_uQ\&FSX'G͋K j TЇbM7MrSgMA̬Y=T5.耣,)";D{jkm.5SGގ L&)eohZE>$68Q@c-8`jbozy9w:rnGpVa=j8QI[upyU糯+qȠ-.L cMm̭n8DY֣`8.t`64'L{uU?s%H \ *PF@HP[CǍYlXkٌ-10*\-JpZ;wMAIV*N!'[!b zLnAkpʆ"RI@ebcLhHp 8R(S,`@$׀T %4-+.ӛP &$b d:zn vEv?3UAnU{CZ `Q  @ C:X@ED&\ .t/52|ی5g,A?C7a?% RAL&ZJ I5JA4:)y ֑8v/0ŏbU[j((SѝG\`#)#rԞwD"=dH_(h|W8T-]u3R hQ]LI1Ji1r 9qVM`!ѿ%yeBjr":J(` e0AFݮiF*r/ZQCktȠ ^]JqrN}/fĥ&+f]"1QEbSSt"B!&D̿%azÄw&^(W?{n.e͋X?Ԃ3_@mzfx@\8tFEtdW%S(YaJZe1%OHv9Au:8Xqὠ/XżsUpaPMRuY+ 1H?,FO+|ny'K3iȅBOeLl3f߸ _{`#EK|wIrt@GsP6D2ZG݅ZҘCH#l]G]K0SQGkExXKG.ܖ)%3cEX A !+P(hp0m!zFF,=w-iP`1%ȸ(YGqXd9*$`fj2Ō ٗ`PA?A0j QxTD❭ZD+9Lw.Tp=Q PlYz<`N{XUq(ݠ/T0n/whE^H"6J1N:YXrRQ/-a50èH#jJҸh e \Fr绝r5W)PxYk }(T: k #m@(2wrfF8-+ahCzA{!־?);1>!p~hd)) Ӕ.fA"c\`7afd1Z\o*Zɬbq>paC3.+^,l)-ޕ٨^nYl8njn_ o\] m^'A%xOPBxV75B-~;L 1t@;/Nh{@Џm\$'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN ri^9 p%>f8c:@'bIGN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"':@@GN  L_@0N D)hOt @"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DNuIТON #'57N @;CNCti"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN rp@_,EvSE[ .5\Ղ/O@S)y\w?HӋ& '?%|oK)[JɸtƥϪ^Kl#JU/thwB[]] #B]|_ Jt(5+뭹 r18Ij4L';r퇒._pYYCiZ Q78?MF]B KS283/ J$!F*hq"dɪ,?Χ'3V97N >>D0ٯ.'=PحFŁӋqbkY e4]{Ѫ1K2+Yr@z@ m© ߭QteQ`֖ګ?f0ihAT2ޚrjK nK@(O ׃;Zͥp?Omm5ڪ ^ |,ԏ=k؋ވ5#.!5ǮZۑ4}6 >U(7bv8zJReSZ^ f 9ʹ7\"VcϽJo靺 mی60qm{;0^L.˷LmTtRw. 
|>ʃZj8Iiߚ_1/:cOZb|<#q{RKYa5smkZVoVmwJF&b#]BW^SJ7QGT效Gղ4j1|f{IӎeΊ^\\od//%Ԃd^ӗvyp% ]!ZoUJݳ1ZT |;]WwCk_H]+])V=NH#/thtJ8.]F}+@ Q Ktut%~Gt*Ӯ7tp ]ZJ9=t~LG/th/x<]!-Jtu8t=n#;pB7AD;7o(ճЕJ>D?tp%}+D+ԾzM Uվ/thޫ+@iֲ]];`tG mP0k-ιQ;[z3(:teeH%m _?d/2wrt'g]mL' n+kQƇ\_6ԜRteJqO]I5>*b=9:boMl5O16U807 _UA[>?|#NHi]_>h@4ssnN?&/*,]woۧ7ybՂ1\ý8sa9 L~v(PW{'H@fSӍ4@٪qjnkrU1-HT%RUʬ.M4# :J\~*>HbtomЕ\M.jB^0ndz-1o;PCC6(|7: IʪNTYKhr%R(;mE1&r-zV'ߚr "TTFb؃È5GkK7P˯R,G0l]_p]ǜS=_l;ȁ]d:;UN^ݗlǼ|:JZ?G!& $WH :OW=eWwF)+ETͤQ('XU N{13c34#3q+5gQ whUrSgBkdժ-p#Z $53vnfNtƅECw̅Ip@e|r=I\Y<ݞrͅ xɡM^ A Ah#tPX"sQl5ݾp*(,QH)MFE> c;|EySS,Yq:[C}a΢㡬-:fmAM OL+V_\ 0j%֤e#ME#q$sVVmlH` ʐ+E'Y,pM.,Q1!3K0.638wa;+W`MFˆCQv̈ֈSeu:Cۗ^d \HD,~b|.Jnm;`x*ZI\f A9&)[Y\J0`JzvWpf:/ΛUڧ΢䡼;EEHx؈U9aòSUXH+RgY2[pVS[a./>/ȹ>'|Q>^R2v=ǗY>-7+Ob٣=_yu|u"Z}ܔhq:"f_ߔ6bHl;ue@,m*+dYoh[wہrqQ*8]mo[7+w/pl 9 l@]4] 9d#Ŗ#Α$'@bG"yHpe+k5$o=ѡL .S k|9 }Hfld6y&#}.}CvOVq׃fwPFlFhL`s^*.h.7)w;4;_Cľtξ :@nrD]@C℀@C QKP9̽J#? m7acҀ !g q~9x>>80n_aɾnV}ىpuy2+s:ܧw!@Ft" +";z"A+ S(X Cfuikjs/u^J^J}!k0{H/mIaA:$ UIo2޺9L.KWg+_CMcR7en,+ ˕/c1i/Dh1%EP-J)@2I9avIRrt%2^QY@HjDeAA:(C&`r`!dߙ$lx>uVy>P|޾hQo:kIzyykpz࿩T;K7\}Z\=x@2iJlJcMo0990}aO*lpJ&ˠKoM d%8h: BIgP@cR|.sٌe&o;e57G*Y fND0,(Q\@ -Z`)?$ss:Ф)wv92Mіr6s@d*^VBi) $NJ B8c==Vgem|inʴ}D,"a2(\xYF18,$О԰e|})0F {k-P[B(럜.ZNA0r`cQʭ&`:њ8e5dt 9$~B)_ȥ2НfbH*kuzR??K?$za<[NWt?aʌr\y1W0&wy4R'}/~Q:=eTz1@Vs$@u%Bj-u: i~~?^||3V`xU Ik^e%0.yj]4_SWP_Ki_/In;u.>:iKxq||~KX*VE^ XEP͍UMzNңYZ.gO~h|{uy2m^gpRegއNll[Αx2Xrp#DGK TgKַ OEg3+ZIx(]|2>}1ӛ6Oߟ^nx9%ZzWZ/[V}^Hmx|?HiMʨcy$Gw?ݯޯ笲(r7߽_}_o˿|oyeN)ut ]:`'t^{MCoMs{5-Ϸ:+w{_|e-=M- _~1t:^s'fdD|Z 4W,ghQWl,{SBPӥڃF8w~WOE.XZHr1+H1!Z0q/`UF%+,ʡiVwgB5|/z;EgT/Ab`}2 7 ;>)0f St >ؠ8 31b^qCaD*bMWț]Z'2kjZFU(*IʨҗD %S&0"&(@&Ȩɂ"d$EJM%Wwƨ o4jJ9021"Shݖ1yT0:kz4` bt Pb]:@yRfئRtVA]J-JF퀟|z׳ l"c1XB."*W9c!IBWB(^e H=g/,WuXMʷ) ,9:c">jEΕE$4FoDdd !zx5b3h8x RdO'gyD E}̜ B4 |ٷ=xdÐZ6YҘ$|!(P1]Ɓ7SrNGTwn ҠO/OFWvw/w.֋*ݱ}i_}&BƁǜ4$bZ4<ۥѓ2]Jӄ iA{ϺP +1RuQKy$nNZ-^F'ydE"OjVl}Od)iɦ503"kf~6ۿ,: dh^ݹtognC26iytzTWGt0&?.xןƆMֽ[}><#x7#zus9嬟fg/?ܴ۞]f 
y~AEMT=T)R8Q{YiʕhU(cIc#'\^sTl>_YN[;X]4"KQO}.Bח~*(~j̔z׏VQ9wӥg!CGÆVz~/F~q3?Q6_yQ');Wo>/88YmyNމ\{wHO''dd6dv9X |^^Ȉpn*9W%2/[ta个@!סIC dj#]x_@zuRgE+U>KE6!87@n}]n|H/&@{g91|В'89vyr\b G7g%!V= Rq! mRxX{P,V]h {Y]IFJP\bL @!;4SCVhא!{ \CӪ;Hm9ԢbXG4h+Z.rEN#Bه=Y(IǚX:Xu7*&_G-E*ֹV;#8:jq ы>П2jb]omr꾂`Hw oڋ-s4+3#7=DX$`ғ7+g7(GV#fGW\GWZƩÕX)&p UpfVdh6 ZY 3&JV/_m`˵RMX,[ӶMəyqNJUTh \kC)4VO>羈new~ڽ!`ɿ_}~K?|/OOnM8u5VUFs.&m|tk*[%)c*OU^{]yzwz=뇋^?l[^ۯuC<|'z3yߝ~ 7NLrc#aF|nmwZ-,swY9;gyDQ_H3~.io@V^'U .2P fF*&)svzZ|هm-'ZO";5*׺/6Z!FBڡkRM7Q:[m4DkQ2;wvk~ B ~=?mknWX]uDg1"٠]m3@Z̕!Id&DC%Krl݄z7?<;O܃/P5V&sS'JaHGԤ@ fGӤݱ4)h֢zf|M l8L/דgN +_wEyfnY Op!K)yঽԦ%Nʝl>I띙|j˜n"V&2!bR1qFu ɗ('ђ-\b]ecB8C)Z.#9yps )60섗,P3{ɱ9F =%gie,O yoٜ~xmW|:`ixяYϜSɎ9jja2Ll(h$cc^cR[]7{؅ J6b Jkv2v EUM sd˱[Gd|EeRBJs`SXfRN6[gLVZ+d,MQcB|]ac1yA3ѓMfyΘjUČM^uz)Orfߘ_&I`w ?講(:F^4}% c{gu'ZG9XA9Wgw GJfyyգI`ԁ3I!ZI)\x4K}gz&Z KlFa2P&:6$$GQ ,;Ds~VOl@R]#^ow ! yFE+\=Fa.X%5BuxXlgv4*RƎ>y)>{txk@J5WVQ 8D0X!6sܙ=19/Ẉ7FbX[0G81T-ńV[ˉؕu~it 7躹4Kr4>娳/8sH@5$dbB"ZIh>%xkm$7~mf=`dr]ؒ"3; bݶ`^MVYůů84N>&W 6 lB WM&ly^t. 
>KF7DLW*y_Xqns>f|trs'ZnNec FY`ՒwWj$s/"jQZr4"X!EE>!c2ɠ.q(<\-A&BR^"7N 6L[/(ή L<,)_OI %fJ?;3csU{M*^1^}!*e~`j14LJaT0/ZG=*ԹOTu0~yZ WI:} n1,I) ɨr-U lܺ.xAV+9tڮdHGW-{X9ba=サq3E 3h'/7U?Ů_BO.+ߡrߌO|dS5I sIEeAG7|۟cl !4iiVKDkj:% $U?'?laq9Ϗ\D/7SFOb-oʬ$ߠ9L& 1 )j!5ou jqXyj%FpuN&>m ;+awm ߕnܩ04ߟWyYNwYgK pdѻ+ڴD${;kץ@w8W;ZWԺ~Һ=MpzeAA?uԲ *7wz^5>yòrDk-hŷWg>fUClC=tpD:}<[Ori.[v;N|U /;ױ.B@OC*DUt1OmJȸ5vϢdi,宅^Ѓ8U,[q;1 V9풖LrƤ#(LQf-,.y{TBKk=<~_fl&Цs/]*&Za&`6ʹ,gk5x!_[An5n˫U4 iRdOKʂ++tt`9ٓ"d#uрvvPOB㜹haSGv,ە/ Qq2fGfV\%ɍ2{ItItEGuj]!FBGrF1KEi,0iT⒕(r10fH]Ezy"jRҵw .F+%5i|uʷ' &Qzb ?o7?YejU}1J>M#/jJ n',xSe7w釉k8y*bHwF_^\yt|~Va%3-9% g\2q6fgﭑx|>Mgd򕫚a+ZU]>2CmYť=O,Mi(+(gE|rNמTGFnQo[xf|TIetgWӳxT+rGS8g^>$R|rQل罍gjZKh[:oofoׁa,\ڥ0/s+jnׂ#hͬrrU'^oIhs$tksz;Y_,HiSjTNb1׃ٞS{N.rרQ%x:Rִ|Mbf{#g7GnkxSCk {q0;2G߽ߕ?}{~ xǷ~v rMlGG #3tкiCS7~ĸ;ƽ>Vrڙ[+罟n~v4 H}G9hYL\@hOŏؤ" i/6]R}[*Gpg)UN q^ K;/H ڛ#+B& rK!%oDb?t8 *@*}#={aaZ{~;ŀÉGZh/E&Xmg5S\j$ dJ%]Fo)0d„1؊>[C":K>`iBy'N,%'D,cD4)R(m NY# xҖzm 4( ۽bfO[' Db摦3dH@Te5qhxt5Zk~1L.T^ʠau2D4CL!lR$ 4U+.B,urtBNSN! J •f)sb)ah!(Tj-vd4313-1Ѓ l%)}d!',zNQ'n}psJ`:լï]J(Ԟ%&@#3e^0[f-Pt:vJ:vUFmWeKT|Y< N/^ } B/ijVuo}*h3k@wX \nC?Ϻ@OŤ:g x`B$BYC˹ȅ|C5cg&T%aRjD(d2\eS**R(uR \JJd2sH ɇ̵1iM#Q$xFѯOQwh'w~r mrTys\&ҳ Cb)ĥGd Y,əɫ$gR]++RҨB~y5Y%{AOOpdbR!r% q>0. iDrs@TV}v!WˈExPr;0 u@Àt ")ȴpV yh:Z|Кy$ U͇!MӰ3~ǑRw{m8\bD}y괿Zrʂ;cV^BbHE F1iޓF}J2w3.Sgzv|iG gFDtyVa6OPgQ3CHy!Imc;ӊ)pǫ߯:u6a-rVdL,}ɘ5R.g( yHַ'~2J_+&ء6CC(j۳.*/dKx #^it )7Τ$䀆&qӘd.JrL.8J6؁X`')8z?/.RlJZxE"$Oj䪔{qT-Ȅ%[D)J8sO9`^M=ps{]Ϗ:05zؾ;Ə/D6{Gֻ`P*Rkj~WB7ËZ' 1h򹦩҆ƘZW I59|Wsd׬VRc3jɴ\{ޜg=RV[J3AAۦK{߾Kz_76k(P(r/w.ɊO"lꨯ>- Z; 7;buVNj]yqLJwf=MhGO(XrT-n{jW=U7-e*gb.X]O=Pzu۟I `E"PEj/.oTM?I)oceӺl_G_tqne/Bb9439q9^wFMN, ,*A WQpT`L1&FrA UɴZXrZ:Hk!+0<]CpP&m#Vd$cN]FRbyw؎++0ꤪt6XCAU:C".Jwl^5qbCOU_ؔuvnJ} 1Iʦx/e#ئ 9w(NM .5`tF̔s;mMsOnKYE田Y`,%%IǢOYGX ZGxzziQ%ͶLN@1ibA1@ق! !we=kM#*?T1= $?f=]zټKnIz =\nR,# DK+߹T\̠ t*Ц |<C}~r!s˜,$&otV =ӥGh R# 0җƲVkP+ή:?󓞱tc;흠xaTV8(\.ȲHB"҆!֝hp^kkG{6ZgFGm ir%3F9GIpVrEV.sFX>2BmEDHIJ. (_kKXR['mH ewC'{` ,D W3$ȑDZltWwWUWU'T(mHT)3 XM4&D*ӖZBw@͑D+]w+lJa/:ՙI#hR&/_(. 
dpp6y&gA7l22X▛FJ0,?QLJJ-Ѣ2"aL/Y-CAuV)"Mҷײ*;-}e1ua#Z{х_bKˇ&=A"Qh;t{s?\i.oP* bxY%Nt\o"g.>ԗm}7t:?;KĮhiEFf=v5fjߖR ,W-?l3IZ1A'V~Iܒsp~NQeMƀQ|uTy&R6wf~a)Zۿ `ԣch꧔bہKO߽ ZIB;J|Riat!d9 e|F́3;_7 G|uM-Y,/9.a'9 -кLmuW؊OE3`QE/?w~gP1S8 @MQar\67Qj|_?oW h(x7L4m`_v_6QMrD) !5(§Y!ǿnJ@[E'l=w9,1)" Et:DRrGD\f sL(QmGs+@` A[R.ti-B",m W"VNBu}%{ LDG}8talUVbe,:uyEњ10 .fdrH8+:HC_}.bi)U$*8ʝ7 1q6%p @1b"iZ+I?XśŸ 3 +&SG\}'0re5¯qDEu2,=+єpz^è e4:|\+;=+?~_>}Sê ;ĤD씑S3)gbZXFV;Oen10o'L'4 0g @jTyX9μ4\ma`N)] n:.\rZTkLͿ_mK=ݛWGK^R2|wK1|8v)&8 n81p t n햻a;Ii׽HNYLH& d : }z8{*GҀII㝲fj ఄsRbS&a0" | \K Ak:fSWurݡ4ZƕT4#2.PQYUxr `(|QY2 u1;Xőp.4,T4xQ25 D4V{ hܘ2by@cDcV05 GXРWHd6GTY-{Q4K4)sTˤtH\(sƚ=b *+ž\e%h5& vzWY ya|k.ajlW 0rĒ[K,Ţ ZAZ7< ӻ}?  lwbZ+:JI@Ji`H-Øje RsP9Q)ǡՍx #z!Uxd!R&R/5eDDL@7hʃ&tYSI[z\fa6WA8[j N;OfiVhkK3^Vm3"^~Nc7)A  r )Q{-G2rL LHE @x `XOcיh5Z`(3*h2:Z rD`p'5]1prW@nw٬&^!Ն%:E6+1qk } C ^YapH lu[AxK`ÿ0c𱨗ЊzqQN  $|9Aܢģe#JB2W퓜pi]݇hb&_֐HW%"I Fk Er$ݢP&|N9k$1$:o夣5A@ V38OvYj!\滺N5u ?oz26,h6 ws l g\imW?;KNS=?O&o&7eJȩ4iťxɫPv|65SiUM~paG!+C A R; dA#L9 )( \` 8uY? &xG-6\:$UX .yduky ɩ0'k}g\g2-Vw2UITcTr>vk!ߙBkdjΉK8>B|*FUibnNujLkk6Q\=y //f;H$q-sm>:ݺ֖GNhM:BKF",#]/,Vy3,h}t3JuӾ/`:f5\19brui$ }zA(Hl[Yn3 xS8KYS=.\w `Go޼{>߿>wޝ~u~?g߽/R% D`b M[chAuk[emjKƽK>*lML@~,~}~oZ*YJxt2=5A#n%+/zR2]du* *w3 QcLr _rz{?`og5cu벑Irioi׳kAB%I90ճ^ Y뀲@Ĭsa&@ {BZQ*"7Oken~`PԔ'ܣ|,?4|_:Q^Klv{^{w?rN]{*w5ZkhÉklq-Zqݤh|N;!8u0&'ꮖ6&}mqle*n~zDP{뀝}1pΣU!C0C롣Ѳ}=_ő~JQ,m l.eӾRB0*;rSV?CV`u?Wď>*?;3*udq@dkZB:((E묄(moT1!vgqrG*s V1vۃ}yiٍu;->]x %W2W>D!UQ!J$ҁ`$ hBGAj˃O03 >= "a`E41DcD v}Զt+$J`%1;ʰs&((]F@Q`J8 ~`_Aƣ h,^}*pw#+;)GTK ?Fg~=1ĒG h)K*:[kH1a c'1) 9C . 
zFgB׆<*JEFYp)`LC!a`PE9&i/F]G5g氾v{za"JzJS֐d 3Z;%bTN{a&X϶9>+ǟ$azZ ?hd'լ-HJS.nӝVc:C0z=i)+= lq:} yHݺIoEMg&CunXd I޺ /B:kZξk9Ó[Ϝths%-8mmi'85?"+uRfB(YkƄhB874YZ_Xz}9D6$|sإAZl\D<˜dJHNa X]ŸiINm@Z`wAmf{վmP}+ j /דRS*IfGs#a=2cFN](ItIP!gt"'- zy579N&g_7Q%H)LyDGZ@EQkX%"mӻi-K6-EKՍGrh:XRS3^p'C(Dc4ڷyBɋ0>T9eUj*bXI攄JHxЂ*B#Z< F<yL& ZeY(JYR䬁},\LGeHyY6 (S,rkǞ"P ))2^ O-#Ϛ gG?;8}lPkBwQ)-뻔}f<`=ܤE!τ[>b1 T7O.>;3ۿ.E"gR5Ek hk|D !A/V0QfT+Lq';->/Tܷ(N) dqV+.o&QTʸ$04f&qP[{l╨S xi #uIlBP'lL:I/"zx_jo8 'ok"c5_P|կdpw0b2Y1*(W6YUDREฑQ)ȉ !M=6dKg;YȾށEЋeXS89#ܤl_nO$j:!aJ6yd]MB.ws~htz_҇m]9~; Q } bC"IiJ6!mK+2K#BI >3uщjBqk# 7bv@Em&qdqpjn&nwLz&e0VֵZ뤻 5 ]at: "w*J3AVTQ岬72ѪVKnģnR񨑩tTDWBZIXŒc7WPiPqؾe])vEz/OO~ VfZ켰%@A:+*h>dZ&^>?w>8c+Ɠh$t @$$0,A'3iuGI jS-9zހ'W?>y=57zn8!rٗ` rFi!f3"Zu/Ny|< wӯ`כ<]MO`7vS|<7 a5qJs_%.zŠhΐA›߿Y.MSŏ ~V)zŢ^,6Cb6~^ ͗cz,|JRezozI^!:tBIi1(lV^^k%OXev:[ePYP2zz͋dk.aF"!*JdTdUtbNvq-RPɛ,2rt0I_В1MIxth(5k vWמ<[pVkzUdzyד̾@)&YI^?Sx-qsIN!uՀfxR3t ٺRtk:%#κ`;vvVdFԴGbF64X9 .uB%5*bL=ԅFέ:K`ݐĨ$2ŤXtl7N{.HUcSE JҲrtJFֿTN`𩱣6k̒ns<烪[N[kDۺf_Ys_d~,C7Vwg<%F+u =J˵i~y(m.+JYTU!#@AT+Cdc Q:.+B:-2Z(g&QJR1TY]&FYD:4:Tkdl6Wi2B 儅{;x$q}uO 2,ANNN/O~k&@`bDTȪ1 5Y(29vi~ 5 Y Q %)̎8Z4ZJDVù<_ 1Fuڪ1j '$ ˌ{}HRˠ7Busj$9ed )OkJQtkrɪ|1cb&E⠺Zp>_%x(>EDh0!℈[-l:&W lP$bB)PRpx96.FDNBv;gbbO#i'uR1yۭy_ud\\vsZg3+Mc\.N]0%6w#GdI9|HkA)NeSK?{FB,SU}) $9 򐜧oc kY$*bBCQ9mC{2{)8tT:C@a e7y}WmV{O1xSoQf;n1 nxُ]䧚x7/.},I&%øT˒`.5fU=>NTg|[NM*->iym{/;v>_9L 2җM毻(9oGyr'[l:Gdr0iȭJO;Ի/<}ZOk܅9PFB{¬]T}7=+c"j3,SNv]6R_9)70M6L-?#vbWԆNn+92]j)֬/9@ؖز|*Jlqߌ].H"q9r5_uP} gÃM'>b֎Jr&]ZknXܿ_qqy|yU۬Oj_`ipC.`(Q$%n/n\ b#2g,huAN}θcq%<}I e,,dqpPbeNu-PJ|j 26}ٜu99Psey"PHpyJ3텼9/ͲbTXQ'e9.#s/"k,9K`K/H[qzv8O-rV~exf;\㓝'\?}abӕsnqh=$xb\J:(^Aѵ09wrU:1IP-HѮQ (;C.{J/_JF6 $ s d_%3WK*Wv)ǼI{!}'q˥\jIx\#q>ImY`x0KD̺_r؇9 Xk;9$9B>d2lKBS-K6&VInw-1Yf(pt݄kM7'?W)~?!!o|b8^Wcq('߅eQf]](OJ#nt{ѕ0u] %0ue!t+ѕr7ѕFKSוR]йcG`؍ѕ3u])%YWG+rlHW t+n:J;}])eYWG+-HW̶v{ѕ"L3(DQW#])WL +-ǩJ(YWGtr^G`y])Ji+YWϢ(Z`=S&,@>`N N4F+WLI7Gnh0<Zx:emyzzGek)2B;Y 'hB/a')1zZn HL/RC:].ɜhC< Q8\=Ojp]\/Suu{]b_{>أFWˍ\/Z0u])eYWG+˖ :ҕe4Ot>n$2YWG+dߓw5Ѝ 
*-کJ)'sʙK+^_x>OוRcU\Oѕ{nt+ 8+cUDѕr7OhcWJ籫"ʹ([{j≵de7nc/RZ燐+~ t% *O<IyRʉ2yt%B3(ߋ`R8wQW3t+Jq+8u])eڏQWښHW ])n?Owᬫ#Up'] 0~t] sfR0""Md ,_8Z{hli ˜t3u{r^Z^~0㵄'xtǸDOb16r1|VsؒO\?]lExKo*%Sv)oZUNe)ruvݹӼcF݄먗p]i%Q1>'Օ~(nQLٍR<n0]%<x/q@G&63ƎЕukC0vnFsJqEWJaJ7uK']nt%pCG"M]WJ#]u8nt{ѕZpSוRNmjճ"Xtj{Q0u])%cWǨ+)t%l *.u+ fRʵuhHW Π+ 2 PWw] p~t%^t4ΠRYWǨd. S0&D]ބƤTSp%sz]mx ۃj0CGWp}8ZijEiab8BW8j׬Ct%օnt] -ݬ#!v+Ǝt+%3y])PW|OA]) +aA|+ѕr7Zgu8GWǨ+o v+>zqؕ}%YWG`]xփO+t+GWB>9xRYWǩ+zz2( *n?dul8y8tw={X Rfq{͋gohD$I[-ظL7݅_~_~PNW?oڹPn/*-Y[ E_ϯZΦkUF޼(&7C"廇Dc_kIUzpa_Wp Z2[wTnov7G1_/'_J?/Uևzo p]]r mjw)9mm~0Ъ>|b+=]=U"_1;ܖ pG[poߚﳥFfރguGgޥ +!HmY?^tb/d?ܒ8Ē>4krPm< c_ ![1[m{R>]:FouYe˳yK!A1a!995D}vCTy];}9)Ćr RA4i۬.-ra>]T⦛ ΢$ҁޭh@}BLaX2p A K %!dpj4'b7G06'mh-&%FsnJ.aE2dTP! 91&vهlFZilA.JH, sf> @ 28I]J͘Y-jۢ,} z'Yqփf1W<˽\0,*EJNIv5x}( ~B 5HF݋60^dQJR:Mr$0Fj6IB1$V9Lub !Qf&AG8R("@]7oQvs&qM 1št(Ț,䜜9PъOB(RFZU 򶦒iҔUr ދ{·}g>}X w7o e'AjH)6v%$u\ui4IR1I ŤM,]Kg}XM "jk*Rp,1#kXk VxKH=$k -FC0T.`榄brROu#En^,4!VZG#=5.j֥fuD4HhŸ 6:Lb}RzЙxƁdPDQ$& o$֒z%@?Kv#:Kl0DD)QnI'eJ(P&)RKYDs{F\rkIjCsP~Kx[,޵qdٿB6m!X, g5M "eٳSMJ,XDdvխsϩS DB^j<,Eiܜ59Y{2P \rC`P*ؒqW2 j) c[N-t:K& U Sbb*!2%!ML((py q8Rq?f+s JP W4+EL\:h,," ˥tr\F/`B]^j>`" T ˕3Vf Av@C>fDBGfEh*nF)K<;Э1  c8bF`h&|0(lA0IeSAI!1δr*@p2`V$}D8&VJ!(*挚10J892j`@FmY,#KQ @+do y"RDbk.-=*)IE>/)6BLbgV UQc%Gii"j$$c@ Pe09Aj2a+֢ƛ }\uXgLD҄a"4Hmne+F%̷<k1ŲԨBus.tiv\ӮMkvdv!ğZTir¨ȷZ. ATZI|4x K8,}rmV28Ut)\]p1;`K,$tD^ JTx"D&jZV"aA>X\$Lt~8XeG@)/. Eeusp0(!A/Q7C|̨C <@nxВmy!uPʥP4LT`=@ y U@%'' \ %X 7nkbX746I2'XQW*ϕ+"@r iQ'yMne N؞PdQRXjb'UU`-и QE Yq2BcsJ7H0r#SL=XV=_85'EiLA I d:"USEc2ˇ[H:? EG|5o:k Pٗ9k ۻp QXm(aB3^ =bN F9@>񡒑8=ZMfHOQP mJMѓJn%7`~8XfԜ 9$U~"r٥P#Y 6EL AD` %. 
hF BB9KSkT{"!w*;cgX.q0XP5a7KBu]z)`6@%MǵםA()0(y~}n']m0:%ڤDi3Tp-+t7+3TZ쬻>-t9P>^:=+ ϖ/CNB_뷾\Hqq n^.mEk^8Y$\ThY lΟxX3.n zoe#ɷi.q4yXpdytxȅ}?&Ҳaqzٲi5[C9OJ9|~ԯ߃6(TVa6=:_M74m -ԾsPo=H`^r>^å=2\!\ph7\Oop唒 Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\Y Wcn W ?m j Wq2\}+Vr2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE4\R5pn_ W-Z_q$0\} Ff;v0sK/mxlRkGw็f/;LM>?3oᅴ.VC03*RwN=WVqm?~L:%gF>3ό|f3#gF>3ό|f3#gF>3ό|f3#gF>3ό|f3#gF>3ό|f3#gF>3ό|f3#ٳ̤Wv`8[MgS~ .>uh0%:Wu#C'a2/v-\QZ̗Y9޼[eX -3= ޟVgˋޗ4-n\o.Q \p_U6*.t,?S8m$y7IY{:A7C`^v/:`o-26pRm29y]!q5T9#+lrcľjJ&S1g67fƪq+VƈRKrw4VO[ݒld%-nvK[ݒld%-nvK[ݒld%-nvK[ݒld%-nvK[ݒld%-nvK[ݒld%-nv;JNL~{C0l| ׿}Rrmo@OXδ'J"Cba-{0>þQhcgmZ/ph_:\( #jJݳ;MRp| phg_xa{RWzV=+V8𽁫W7tjQJGp •p+Gp%]<9 ˖myC8=ć'Yյ0+=:_3O",M+YNE"/~WnI +OdOoH~m7zxF/DsH׷t+n7V߰ S3Zfnm[B'+ Ee,S|q\P+q `u7 ͊h;^}nb-~^KU%DCC)pmv8=]v ij6N}Y\3l׽MvA%o:T`!U4EGؚ"v'e:.Wa%߽ _npa\:m[oEs;;ޣ!{϶&|{zv΄ߍwg7ek^;K8}PʚƉb '3KmN"TEEަxy/?6ᴾoٵ~ޮ%_5Q Vu^uPxf`]ݒOQXޗ{>xqy[ƻ%2d3pn8bʺ~MD{_iFl-|{J5%7.w`˳4E']bہGv5/;ۻa~KĽ }|j\YW/bmzP#F}'ؐ2??f{tu<9EjG &/CAo{m/Hn}[jQ2(SfQ.t 1" \*KqFE Fh~;N~`^cc ]A=q`k9_t[.һa#Ђ[I6gRBM9I=oDI@"q_7vabmw!qݔd9b&bfjɢT;B2䑃DfB[ɆAU8rZs;OAèbp Os)^M:Ħ?ϗw͎!|9Xs|w(cV%5BQj]oixO:e o'0ODXoQGԯQE_d/+WB9fDӇe]MTɸf~}My5)I qOx] Ѽ ~s":V^;&2ڢ;jẁp;sw5i34O\]x.޽9 \%rSDeWy/& o;\jϛݖ]A7<}k$t܏,*h#B*zOF_^1ɫ?-jn~fF"z{$Zчgas>s_VŗmwvxfCy@Q{Sgd2&gߝ/$S4"{њ8nunBGIN'Y=RZ/>~ؿ,MUydأW ¡Qboj4&.#+JpF-S*.鐪AOxmm_e-(#{{ ֛[-G=HwybgSȅ=Ժ{Tħe39PdOTۚU=qzMS8gz߿I7l,#7[ݝ tuȨ>~sb3wθpSo=KO'ȶcWxT΄GD+5OuK~d@)훩uV6i ^WUq1J/ ?'ߔ>IPZ$5$UKSud9 wAuAg}-YvWi:I9&sh6INJTS&_Q y*ʉKu-wʳO.xK.Z ŘB\R i!FbgMÜmu:Zd}"&EdMY~1ܚyhdLeM֗sFIvL.z}X/-Q>KIP'm3TST :r6`p53P:Yn7a2~' C ,Df>h3sF1EҔ0vUƃG2Ų@K B Ctnig 0Em/`KG@]j_̕e爪 z|b%X&i_ƤrGW,r FeBGe `ASjmē&1qG$M6R;lca5c(*S;Emo&U Z$Y0@[;9.cTtH#ࣖƒDlI2APTR;"jƇ툈lgehoc1n*ruuawl1%R R5F1L@"Te09"< Xj:g : (zj0@1סNǚ Ւ"mu]o9[ A뤐s JG] !WbT]0bY\ r9[l;7!~y9]v[턴^ iOVZќVd!bnv{Y;.ފwByC -~ еԥroo.]LjUa#=pR_/ڱNu WZ|re.GA?N N9Uu@MtC;8rpΑ3Q01O+g'veߔ5YK-s55˅Uc]jU*\Q 8y&!3S s`M ƊO:.7.\9 FLU%0tedEOWZXRR32Q."Kuy.qXJ]Mã R/@Lo(Qԛ)J$"V5i 
(QҚ(OXkkO/?lBQ%A0`f`qpxlٞsj8~)7zԳY$lCjR5Z-LW)j 2I(W1.de}Hl-yUKS >4T|ѧ1<1'E\ss稻yV i_m^i0Cʬs(z+bcjL!c'##)"_`Vfch{׃!P샠'J1IσYLq/w3!6ݤ8X1Tjv(&V.NTK̫&8S\>|%i[GwyR.Xoyz2$f>ЗP&1dup>C!rOvi8zܨӺN;6\֏uFb cF$+siQc YcDE`Q+RT̨ࠩ I# ULjSi#H[Ψ8GTu#+CVhN8ljtb LƅmXBߴ(rX@%a5W.NYăFƚ{RgC&k?a&T3_\ٷQխ''֭oZ{j[ٷEicŵ>z}o3wɻE7d#tyqy <]w?!^^|xrX2,jCd)+Ň@Y`TyuSQC#{12`M|rʾ SZ\MS:zscuޢi K;Nk'$QB,R^(uVX)\:RLcј(Zcb5 CdQXt2ŋE%B >J>* .>FMpA׹1#vӏS='WkUSbƈ#'_`\.*bRAhqj߸Cn\,Mym4?ÙSޔ0i@oט% ~j8V5nZr_"N~q$ꮑMNZa=*j57*'m+E>sc4x)#\-ȣ{ :.(юGMяU\яw.̛cׯx#&AǜHCq!{.`J)LzuMݣ"mso^UZ~O 5@ etiԷ]d~էU_C3f.j_:`&pJ`Y!+4rB.L)>&ԧӞ䍫5o1}v0DadSWzaׅaYTFX1mݽe%IC&4:g~q7n<-Ne7i:䞖Zc9W\]9sdq r8t lMor `ױ-.pxclq%/  5ʽs${xW3J7I.EwkX/޺}>{ix51\.̋H?Oıښ&Z0Yڠu\u&'r:1D%evs-&r&9^Ya:M2*cA%9-d(51f2dJ;,V.qe'c9m͡f܉%YIRu*1ٴԢs:dEʢmQEk )Q+uc:Ӫ06 d+ܝqFa7=dBˑ*>br6}gv=vv"n[""30hO'Nz"ZhوZE+ ]YoL5*J1ȅh`Mr$ RTkƤW"S%On9xh0MFW||zbLٮ(SYRmj^K\8g'˕6c ݋OO//FUT.p)Acc!z;-}1&~p?;~rE?WoإU?܉еѺ~ ҈PbR*^ԓѰv'67ǟ06#X3HicG98׍7Y6;_W?͜ɾИ`n_82AB\Xd$FKQ{fe4LX" B4E~EĐN m9. ؐA7τ;"4}(>T9 eAJDx%+Z>x%z@˥OHZ lj$E*?\%2 o91UΕ +]9C%щMvvmLȹӸW䒦$p!D\)# TP NgژT)*UC)[fvUH&45NmZ8!WyWh MN>~1u#??>:=2ttTCb~dC>^s'*֏cmPvq|Wͦet]YP(٫0է g] bW5 ;4?ƬXMz™L5E% fw=QIFs2s*"{6o5PJ%c^ZxM=?ey8f55Tɥn'gq v^E+bvMw 3Dֻ%%,%^qˮlܒB"k˦۫7$Fd&̧2@vw>eolj|Ʒ77❛}޽y6ߜצ۾ٛZ˗oo| ܢ7o`s{7_-Ǽ͞ߪ%5~K3/^`֜Ay뛝,y[Hó^yrS<-Px%FnR0u^Y @k,:,4~̮~\(›.He,[0GruIGaŵwy%!šެ4`l3׀p‡b0yGg%'$[ek:Myn €8 n[{bp^P~yY7EmҺDFa0찛 shǶ_)֪Π(Qsl2h&MuUyj+k@ϕ}42jRgZi@cּ2s+k!MKuIƿ٦lXK- '(ϏcԳgAh3 ZQ h(''zrc r'?%1&wa 2|mHFl/Vct&K%3j.wpz mhB+cxNGK5* 4ZÏ\Lf_\U̮\/9v}6[nsY߿^lI'!t\ݬn7ȴ!|1ʳAi>eͽMc(qsK>\qbCi5Tz+xQ{icq3܎Aqt^o9|w7˿<}7xg`= HPL; \ۗ?nߵlko5E׊9لoѯ&myCG X]8r]şëx 9Wknl>g/M*2)~ln[f4PU\Z@ "Zɂf6ҙ2ÏPv,gd),n~Q~xWeC1 LC.G2B!JD}҃)֠aR0k"48{ 2m;Y? $L%&kݹsGxhl7o_2W]^ىůw2\8ݜW]}A (K+*P)!#D4`M̜J$n@{. GΓ=կQW[h20,nuQ՝U/Gv|o)|;hՃ*ArPvs4hy/7۰ |>|9KQꀔLQ\EԲwf*W+V((ܫ@(\ rP. 
X rP~޹XXw0+N\\\ NeP..B@(_ ra\ rP.B@(\ rP.B@(;\ r ycj|a;K7K=ky\D!띮QҵѨT(Q@@oBOĽhy<7Rhh Rz h\kK!1$G2jdv`h$*Hᔐ;!A9\ptDNQ Q[';hW]U X)$O1L#0@s&bI3?cRyo0cN )Ғ@X sxHhrPA3 e'X-ο5 T܁OT\$`+Z6moR-3Fਪ#u<, ݁1׽x(Qg-L?`G熐B H^?{OF_睮 ,`fw0e#iHˎmR鰭dNH(#n̉ IJfU]qt);V s.O,IchNGډ$6 d,8`Ȉ;5MJ.%z1Վ618KuӐ!G2ٰPymDgɝA}6E8,zOՑ,ۛ{J+MtY#]vɍ]]uoKPxX:du߫>ZOf yp7}ݤOhEe9nilo 6֑#/ˬ-xHB~VFW4^]Gos<1rb[wpQݶf^ ~f{3ފ>՚yb:;&n{]%j.o 1ը8sPa:5BR-f%Ri+)y}-m>iB-]vQD˂wXFL*$&KY w/g #%O!jkJGP#RNcG:gwH'*d/dCjΓTܢJ )#Ǥ%\Eu .)k: :V\QpOggbPe(I{Rڐ% R}P8mXv`4==ܿ&(l2DVV2mJjBt8{Ս; *=>4֜dpttؼ%73fO"H%0|ĤDi`.rJ۞$ϧ0= LIijYhJRfTBQ$@5Tr~QַjOsf*zx?3:0Ś4(›^4J:;1Zһik.H}(ێ~~6?'w)-d;mwW:#Wx|5n~y.K:.hRՖϭE hk] iZ5GӴ%$ӛaʍh_u%GcuՁr2b%Rb:0/Wb)&g>{egnOΒ#"p*#%0ًsJq\t)7 h^R$,pZ(ދҚs8=g ]/[uՌ0sGܖԮՑͺWeߙSWlUR}$^Q3ARg\^KrK(ξRgF sK,(-܋.UVyY).G}EfN*ǫed`XyB &x{SLj],5x]() ёe-yQD4s)hJd\+@rxcpa .}xO/Z-`սom -LV{pg9R@k1AAJ"*pNhw+fØr1Y}Y \q(]NM:adۢ՝:e%-p'A8 bY *阄BÅCȢ)8{ެ, 婈g70T$\h-INj,+_Q|>-ӗG˩HB;+"* D!4*yTqJ-Byky@MR=O1GC2 m6`8O%bErR" ţC Nxbq6JE9 LN7XA,޵,+_z"2N̬fqc@aZԔl;}%JdEQHp *3qdoe0+þs9<=d, 2jB*bF[J R:X'2|ϛm& &5cb=&u3-Vymd=:%=|33ol>eigiյ~0ϔ\u! 
|!F: ( mS``+4țP,~I^JnV[Ћv^|eQ Ҝ AQL62PpTNH( eNT:Sy:z9֔jƯ ^JHoި"ښEv,>pvkvf㌎ragC÷_ppT!:xe#aG(hH4_u#FoXA+LQA2'6&8UK1s{>YmS=7|1qwW/2e4a2K4eR1L>^?P`]w Y^H&VOyt_:F fƫ{«Vg N& gU,s-lfF|%efI UgzPpUTFVg\UH;TI0*:x5ٲE,X U && d% [PIJ*i,Y'콧m:OCUý}H(5.8\.f 8S\34Ⱦ2a)Ѧh]Gha6Ko`aЇ7o݁$ E;Oa`A'#tmi;"F6I6HqCRt( 2{aʆt쵵XX)kAt>"U&X5u(%3i"sDECU9W5֬e mw Mg!ZXsүil`3'׵b5$&FJ Z_QE]:rTx@ Qnt[.NX-)zӎZ>-ƔaB,.~J^u7;2ڨy_-~&)Hh}7ꚻM^@59# Z/oL{BvkT~Z 5e JRB@r,Z.]@pVeq̬VZw5gQД+jSR\efJѕlŨX%HݦtdU5eB994Ψ]׆'G~.\e^7!?=y}y+GlO9Mb0D!qOt5,0eY $y&s}kCJrً2B h]Ė]n|2BTJ~3k:OGl>?+Pv76jC sԞfVʉ&֊k"TU+4Q 6&r`.&&aȂ 5 NXĚ\TP7 ȹbMx8T%"vm#"v8G9"7Y|Q̞8 VG!&P!!M3Z =$PӘN39y*Vv֧$LatRk;ͤt6G>.YtCf%E9.縸i51֣ N]U+EJ`Q5KA6w9Cfxv HQqyUk䓻o5Vf_(Fq"Y'v0Ԯԏ<<7pq۾ڏ Fuk'i3Z*V;qb?wLJ4\,3h{mol@@>8 T*jϹ*; HPǢ}VMRcUa?W4ywAw%05Ѹz.n*piK+,oV՗DlԮuZ5IdLdc咅\&9PogǏG7nnA>dnןM:@;SnHa^%e!m5ȝz*(UX:CZ{?:ppe>(xb(ˢ4tJ7*) sZcĥ 6}U*.Y.)h,xJ֎ 1EPz9Mg\(m U~2Y{d PP˵ǥrBg \\JJu(t&C9%tBlM"E!:{n7w}>$êrfv^ RlRb TgQՈlT8ڔOW_V"uΆ~6_FC=P4RV?u{ՀnBoh d@T\d2 *:9#`Nbiv~q  S7 ;Bg&˶x:O )MZӃY/ADf:ݝ؄,~$~:[|鸤σЧr5dwiɅDbCn9/ifowZ"}ϐn4I]6w-H{">V:\["@J_ k Dlں4ǠЭvrtW>=croJkwYrQ҇ߖlvF Ǔ ^TxY tzf{$S;) Q#à6)0302[R#KXok;DVA[ťIǬ!j4YQ]e:پn7чQڢl[>\e^\=]fEp)e/_' 8퉩yGD;>\eѲ D& gGf-z5 ES(00T0H+~FCjgz[}.-bəKBrg3Q LOת U̞`ax]-leqP4&n0]8mGdbpl/o&+ vPٰi';AP& 26c8G.Nzq??{o(lraLY1$LkB)RZaHw'>MGSz(N ʉ?cR xyE.rø$J`KVG>1`Yy2|z:A~?lLoØhy<FXcN h7=}ltPnߔ{95!hG&A fwXˋT?}#xw5ŏ1GXWF@Lˁu^>5fdrƵw \Q[exusnݷ||<D~_qpA7G;#͍\j_CTݖ)Hy۵￧ZJGtr%^A>g1]1w޵H]]}$Iv5EjIɶvᐢd40 ^=3UW]8n⇲6mĴ:]*MgJ/.}-sxfVgCP}aJDpe9/74_ݢ3(n _ܮ?OV"tݔ^pQb?a7x5 кMoӂXrF8Mʗ)\ܽ!So&n- +%eh~/d:e:O]7Q[tv!h2ݤ quh(LBĽƱi>LJ9v(#ލѢ4-P(FǙ薓p~}:tW>/8.~Y:O?%m~1DC]*iߣb~~y`QB|cgّ,n2v!XZr=|inj:NM{y]J㏛ jwT OWnܠ⽮pdtL|vg,ahOQ|wco=ųe|s[&QU=uwwn}t³ET68k䍋s% >%6<.q$V0]UO},RftV7.2Rf>,@ 9[$O_*HFUd@#PkRd"HM7N76H YL*&VB -[z9*Vښ(ER|E^͆'dLh/ ׫&>)3N hI(gdD%.c5D!QjLޢt$*X00Go V;g,(1SQI1ɜg@Dc}~_ զ\ip-je鑲HF,&DkQt;Fsא׵>{-B^\oFRMp.T ?F cJ42:ؔsùk]L' 񂑦gkͥ>˓)'CJjB٪%vWJdkNRB-| JxA'duK섍ܘ`WKlIVe Fg}NbSKQA M .] m\_; %fWkGpJ]7\>YM3«t>xC[hœ-eJ2[%^҈pwXEh~c. 
[binary data: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not recoverable as text]
yh~6o ݪK սфٸ  a3@b#RMIIPӌL *L)hCDQnRiji]*r[iűj .s!C}U|RnI%mf ;@@w}=: =5YEqW ΝᶕRU8m.mh=;uw|P{.RvolNQ']B`JK؜%IإRQ #Sd'6A;$Oi}!zmZjB٬%m2DNU)^_@K#*+=wt>ՠoG_y8xsDv)CD.J4yL.HS) Rhl:?Y&\7穙nΌ2J'r;c<5;ykOKP ) 7tUE~1޼Gf6ߍwElН,ic| $):,CZ$)%'&P>f O?+AQE12'L f \He6ND{5g>&_[KHE7L-aŃfPҹ&[4\u~ .1m.0oQ`g7T0g/ik9L&ቡ nǙ`4FH8g,7qEi&d2BJe "٘  ,L87\:Z3R@(.S7M3A`xi ٠.qQyb>kǑ aq`O }׿^ 1{3/׳<,µP?,~&1G6xk'y`h& r=l_ÒW3wMxV#* q~6531aJn-_β f dWtd5_dTn?>]m6ܟ!uRc ʛf6S*gruBC<5DcmH=h^N5IU>",4R-xyjK|G͕3`G3ocɶ=SŹˤ[F :3/m #4o[H._XjU䥮xɾPJQA?]UbCu$/Y*Ж rvlIiT]+LX0I _ Qpzf7ç4Lo>UhR!f_c;E}/ËNF[ox\c A?q?a ~\*ߞW=|{^ʷ 1,VnA3 dhtBos,7gYe"D8i_͇kz#=*GutOntM'r~6{;?9郒\vx?.u_.!:-{NWfn*#:c$QBlTige3(x q)%α,\dfwØ=%/!:j蹔FJ5)#Rʀ_ty?B+!(+ӌihfK lj\:kLA1JKIi2Zʴfk0}fZ0&4sHkÔiJk| 7ڬ1DRv%D1P.8Pt_ g(","҈L4eAؘ49S3ʸC\$MݻJj=[9FFKL37fSqQ'1,D;-5 # Wº`knFp84 Bu4#W܂ʹQ[N},Mx^lZ+FLX;uxIĞ'b+` ԿdŌ0p,Trӱ'9)LL%=jIsuT+QQ;s{ FԚOx@ ̈`ۄ:K!؀'J-4F9%Sc2r"ㆦ_j`[/3ei"k+GK &Y b$;KWf,Y+˞SlQ_,>}ZarX7:xSCąD@%2rtȈ*x.ڕ$(13%ĕ9 3\8G53f{[V= |ۈAM;$pV 8z3٢vuyNy%|Syf|LѨYp-U2g`d.YT[ATp*n4̛ٯpd Zi?ݾE{Rwj5vtVWXPCGQ_bOY^Ih]`J^d?jw@B7AQnv]Y)"CxwP{$y$ rjaڴʢZjV ٮۊ5v94/Hn-8Ό=۞iJ;L v Z YǔtX5HX 9碾@`ڗ&C:/!VOO%3*"i]s͢e%@.aX[hkyJƒ,_3t deZU->^mۈɳξRD*x&3GUX&BG@ +#8.z%[GE0v߄RZ!6WdX9gq"fڽ@zouESA*&$fEWy߸ćլ."As,13/hJByN!PW,Tj$ S䔪.1D2`"gk216CILS5%Dk.Rx'YQIxuEl n`ֹiF~2GgXz%hTK&JQ yI\}WgO3wl;4bGz߹;~oiiѶ6BᶘpQFBC^p4-En#:mVYcKJ,fn{l=S@@~z}njq'`% cԄfp]뜮|ŒCM=w}}'!jXO>}z -yY\ջoQ s2:ZLuR%-hV(af1Abl6m3P2fH%3KQxcY$I;s@?r:]'d|Ch9r D; OX8NpvNhEIpY/>kT9<3~Y[ gm1p6\ 3~Aq0x TR9㔲R!gr5*l,mC?7uIg&Si;wJa[+99P`:,U1+5TO5%fv&)_TLu,yʅ.UM@+oͺzl+9T63S^k:+# h'$3;aW ++{=uaH[q YR.5fsmȈG+YCPG.Zwܞ~;jß?ޔrU]Wλjv]\˸W߿W_y|rFs3x'϶,>q *(y-w2p 9X oEh6r:ߕO|n~g"6P|?z?C<8^)<6 i@<}IWg !}#WK>a)T/<6G%*&ON4wn[o.?ǜ\̙=.jr{_#Xg@M \Tr޵nD]WlHal]ӝ}ݻ)? #|KO~vO!B2׿ [pτOONOxn. 
g ĺE0 !'V\'~"v_ɼQ5\ϋzoz?p[ixfඵ3,RXISgi""_}]rc MkxI]jx͋oO?O x5 Þ5zZl*7ct[ |"&59qZG/1Niv|>L?=IHshznijv5ETQCs1)T|ȉsv~ bQj:9_*h4/fc9#@tP isj_f'6fְ R{NS`=|{??k=nmoCPVρ׭Z{=ڞiez: \+[N+h$T-ܜ-s[== -WۍUltX6U㒥}qV9G#.>z] 櫝ߞw1 U5S&?U9nƖi?5S҆HYg>}V ~0ufs5; nS֋OŤm%zʰ`[>Hb[]b1wNs!ްښ"^|?m !4㏔.<-s:i.'wZ?SXvnzuZ7Ә Lïά0`k<?3) THDtl]&zdk5Hhi? 7[Z#D )ll31cI'tL^"z`GrZ "/o~Č Ѻ#P2#4>:O\0{H,CVʂ1 4/'X@Igaw8=u9I}œJrh̺c|dm3 NK!<)rZOمI0xZGC0,'rZ?SdR>إ@m_ZJ;{MśeYճ&gCYve>ǐtJﭩR6)%jL=*Ʊ38,?FS* twکMP;u5RYe/~a=ٝHgwPajLb(+G4+Mq$deU3$3@k3MB4;9;j.vVUDcI!URu %%18A4 ,EnTV|5+qE\Q(.Jŕ:^%Goz+/h27KPtd{+; "chgJ;YPZ*{=A["[o@6eJoCD6#C\ArcV1r&Yf#J&A;ax9wzF2rعк0U1+.cFʗ~]:Fy#|umyq1eFY3_lz@  3gtL5,Rn."hF*qcu(y8[ -u+1.aẅ́Gm,8q,aecv]ՔK0=ߋ};1msډ,gMϚ45i<Jp'Z)H$ 6: P{(1UE{l!Q]S3u5謲EZjs0Sܣz2`mA;C֚YTTVZIQȰ2GDv#S<A^|m!f῭{KvFՒbK YI "qI9Ud ԁ&&S¥s8^ءz'Sq\OPghi4je$CT-5$a)K!6:+*P T#(J$!SQbEv"Ȍ/[wfQ!c9Ev%w[Ѷf{3X`Szsy ֝PI^/zb<~Y8oEx*k>]4idg̟c2Qw#^*Ty:Q1)94 k]YgX8lZdpՖ;x4~Za9l}1ؾVIg[jw~<ǃNً7L$C{t&q)ͮ#0Y*^.F"JIK M%hOmK>;WYsjNʚcppSZ_^{$W^z!z+Sy%ÜO:?pz]ޞm.}}pv>_ݛ7c[Z_{7L }%f?1Z{VOˋї5rď6uJ-\nwfQPǛgzȱ_12;R /۝m SdŲHd}uʲ%U"vn,WG9 1nA.gXw/whsۆѻuxyeV.b_y8AU0ݖ{Ň6bt05@I~i$ttnĔ|_<X<@_IW`G271v/ۏl<2R>,Q5\0/N3O>lHxA_'+fUPŮ:01J-SvbPWL-)+|*]c>ls~^hn.w,F}wa:4tvt̬)F{a녹!ӿVpîtյX=SG?T.rB~%(=Rs~%-} YoqٷHL5{y; =ŴgߪG5&$://K@dH&>9>#)"VX!}PN OX&vy],ͱg߿}3ߐ[^*b+# E``:2jJ6d|={/^k$v ˫jr5ث嫯, (\-lq 5nǼh.9`4 Q\Ȯn+w <^%[ۻr<bThǕ Wop_0c|<:2'~6}׳?A+C5]GbϿ1MfO.› pt{8o&ِñ^.uoeUm h]SĚ`giQH["Q RpDCWoaz?-`5\Ȗ\G ^q{Y&:v\Z#ߏ$|; NG<816P^B{>~yu+wNs-1k\EOiU;*e2Rtn@61ֈD{NShacI<;=jA`S z ܆۵Y_ rt݆ AfawoMV௰&eLII 8([PC]ШASB$RZPm`=CEɮRagϾ!z`:TJ YN8EFU~dU`.66ߓv=rQՑ]%hgs8M BF1149rS1=Ol?$AowxUqe+FV+M+\W8.SҚN:SWϓ ;N5J8S-8DjM1 0g+[QW:O'lN5IsjV<1ymf:jڞ{WrIinZ0Ơ}Uzjkv uM X[Qy#ЊXj@,\j<SDCٝY6Pm-X׸ٖ`]|1_t͙nDvmky$BLO2J-@vOih0̐fjcB?=nyG#[-|Di4F>~_}TW'~eѳSM~8ʣ:nF8^ԩD~l 7bH.R:WWjg|LCcc:]hݥ IN#Ý@0#myǎ&nm3m9ٶ\tl;Dk";ag8شk;==ԋ^[|{1BX)ZrUn=7] U̽8 ^:nK!p7\zGD ?>K8/b=1++Rd`5WsE%9|»(s(.LJx<+r'鵍~E EAn)p xT _LkY@GׇCIeH> A5GT>RLrKɬ|Zj. 
xIQ4..( Va~K'iqNI#}͹bTu@[ޱ:p:H7ڌaϑSa HMoiOreHev"n0F68 1G0+ b˥MH):۞Z:((8\UAHˋ@X|lpf{8.kѵڞ`FVpN ˨FꉷmyNJmF*ǷȯC5-L͍F;4ܦ(Q-Y\/>gI(ڌ`Mvg‚yݎzHB3NC aLhmz瘡@l9T VqF_g.%|6܊;[4efXoxYޗ\̒ߌʕ2{_d'tJ죹ylo+}1QI]`ȺJ 4wRh;V6k0Y `Zt1m9aNohDz)HPl_ "qoyNJM!i=x#N&Fu7F́G16 ~Zg@1(qӹk'Q%{aaz f=#T 99Th{h;,9d 9 JqB5c+\ϻ*Daˬ8dk~mB{r7+Qp!WC'E KiqazZ1򈍳SB3syII.L{4+ eaDm%-x!vM=bEՈm:mk,Xu"'Y%|LJ$7$烤#n8L w>=|6-6anЏ2EJӇկIZNG)0#QieNol9GyŒ/4)y?o[Ͽ.%y3d0ⶁ4^lt7aiуaZ$j;`OI Z! D%ͻ5ٗT\JY,tUN0ek/Ben:vHHZSg@e.KB3G6V9>:;SOInە 2m1D<*b9Aᰥ`B`YÕY'B1}[?8L0U"a@j,58mಎPl^bI%1!b糼gyBb`nq*x-Nt]uNmʅTw06I$K[ZG`_M ټ}խ=qve"Ȱ ' ljڀ5h$|?Kd*RP(pp&][*GiZU]Y-AX q)ҎyT z:ؾLW[nM&MS˳ۖm3Dڸ;ܹY0%N(4߸/ ֎0ri*KV,uDk {=M"t%WF/)NrBvLss?Z p\8(/贤,Ky /C?~. rc] Zj'ᠬ`.G0f<ji-j>N(Us?ONɛYpNC3jLr++|t&Q"7l}D*9'#7~(Z1)A!CDYbm2n ۼ}iA݈5tK.6ww?yYN#H0k]FTx\cEa~B6waXF#|k&,!h%ӹIb& /Ow% '@wm#80}?ۋjj4}mA}8jdɑd7~f)ɦDQrbH"J$;pFUwj U{â3qp !T bU=jW"W (٪XEյVu9o4aLKRl8 3EM1e0z[jbNF_N/͐&`8-Hu+d)syC]jJBV\jRվ*&8#6׼ΫYLvFǟ=(7kfI{o8klsE.Y*R0&MRsmrSGops<JQ70lFp5rRq/V4!&&K"4FIloFO'c\mJ,w+VyZodCR$.։E?S, c>`S.LW *ȕd<8 #?vzG`9@JvdrVȧQt//u>h;2s7cjMxAK[Jj+YR3H;@'"|Sĺ0,G,6a IQk]_rl8QJyfH [A@7Ch1lW{\aNUڍ*קR}bǩ{㉳YU w2*Ɉ|EܬDTV39X)UQڢVlQ9W{p :aLp^Pc`7੦Ua:,U:Ri X"^ u8toI2z % ONrmϗ6tZvK xײ5{v彖=3BgOÛ/jϥ <~vu2r///]BA|\ih fZv1jCE4횙kkҮ p&?bDU_Cj1'yXA2Vdl!Bh:`N֑h,\g7j\?'^Ac۱0IO0!CJ(U)gmKih3gy;D iKv#a)' NHrm)Dq*18Hm6:k\|!LU ZI룮Y?VZ,\9׌PZٮ8a'ȷ͈cms&B:sI#(.Hk@e Ȓ`1Xjr+91ԛ(jNƩ@SrͤD?kCqɳb.%{sM1%Q k0Amz߶Gt(άZQ bQiPʤ\;ҰQ;HQ| .9P5 9,)uQGT uQ 9k lRbؚ|.S=jk&S;p{zdxE?_llo473333C΍͍//fd m[ RT]wuuuu<)B߲} .4Ki8'$sIB-)E^AA7K`"mi%i$_ ^+G̵yŅ߅F`9dP3B0܀ gwmO4 F$z֨ 錼;uzapQE<'/#/[dܥ(&U'>>]yaIn0RQ\F%>nnyG4ד\eGP H"Wljc|#kX =B*&49 k&h2U"R<ݼ9Q p6oEɔ%ꤹE6J֞=ד3nwZTSŭ֚%V!ZǓa昨6>D N EmlJ%XhGi~i-xjC(!a!l*02EI*՚yisׅwTj%t(-"ֵ)gx0T5V!ay:` 'Z(nA`= J:R*Rr>FQ@8Lf]erlr">'x]9 ,%I pHd&h*=`4FMj)\egNZF40#8VTKE+92.@ ?6qbY"0; K@մXMPTYSyH$x5tTb%9| B(1 ˋDCﷷ,ֈlyPH%1o'WdBuWivݎa۱v;_i;"S4 $ FƠ`uG,%)(,M9p>H e*"|q gWKjT8c = G(>IATB ̏@,a] td3hn#%Y6[3 .&d{:h:}&EMXV7\^zznmmyۚ\Cp? 
~[nߵ~mw5t@;//[_msߏd؜~O;L;ǝo;х=9m6..8B~k\e/7ηcggRGc@W7~<15 ј2s0/sec:LEib̢EeHFfqys48{f ^zOB3XBV̓4jrAeҌc_[_N@s/G25|ك'ۋj|s[fJEܸϮp5M>l8xcUW52[8jn벻yQF×{s(S:ömM7(OGlyXkT4HupN g-kχmО_o~to9yXZ;/'VLiokh~o}g_NȏMVrКXk^ӓt#)xv?dm eNN`}<N{yg?O:dV?[1Ѻ_cf Nⶏ,/f0:=˘BrKplщWN&2yve-s[ݿw/ű8w_ƕ 6 0WplJ`@sջaŀg؏{s Fl ?uNNhåO~#c%od{d7&Ԃ ƍXb_x?v(‡2JOQCx+lDq%yP2 3N\*+ oQkTPuVԸF5QukTB$8'"xpxpƸ"}׃ʥ}/q_FZp=GE.)Z("BxUJe^bRABMLJ@s<o~F|@.H4wb#pğ;6Z;3_;N/ύ&7!Irߋ9rtD zRʵVO'\WOL1 A8E!C64 PjV90Uqa1):pc01n] 1lYZ* רo&,H٩;{ r܉wnnim}6`b~l7c.v ֬[2Ό:<*n/om L+tޘ v SNBN$«XVo:=>1/+V&d0E8ٯ^OOjb7vWAZpL$-~JbWH/^(dve51vjZ}Zј;"][QS4Ya-Y#D+7.c5oK8HRdbdJO`U0P_;ma4_1@JYU+ؐ9PPSǤn t |7BDanL:ybלx+e5Ԥ1(<2 }2JX̦"#M^F$Q x7cn ~V RQػ'&JYurl<B LM9vB>*YTi# sqz7ٖM:n7ՉYc#lB5uj ˇT6JE' eޚQBY5l %ՊMXqIx}zhgݥ{)(l|{U{=9|[B; WԈV -Ո U(GgC}'Ϯ!8ʩA_Uv4Pk1DZ7E{]KP Sz)4Cm9d?qX}:뱦cuJuobsSofx921yp|Z.O20! V98n.IH ;1\x{In£&;㘝Zvہ3tK;T]-|54;Y\WDr)pE"|-#\uc.z^=8)#.6Yw9ї&~{~MW\2 Z/\y5WNS]Q=vparar4droN{t3zy`n_>i_dezyM8K6UsB/=-afȣuht%y|SM(E 4j=m$0׶MfTwӢэ&:aɗN&SÚ=j6^nmlGkm,qt:O[QCJ*x4+h5 =8+ӅnZs,6W bܜ^LjUE'QkI <Sp+ Pwe6i [1ǝfY`[y63ϕ:~ܪ5{< zb %7n-1#+ں@ g!Q5+ 3BY;!XnD$uV |l"Zޔ˼z1izw?\D2>_ih[#*Տlg[6#p;u[ut$\Gj<_D9DF@˕Y YXP KۡtpWGPf28*+ug~yfv=,KIu% r *pك+|)d;Z빒r?z)Т)_/)O=e'-vo=%Ж^.zʵ SzrlVlFzj9[ZC=U#CDž`P V\{碎ΦUSj\L1ȟ顆g@=e)ީChI(TB.[x~//%Wyʽj1{㳓kOzw7W>B-Q6ӫ_]X2=4X~ d岖#tƚbel#f%rRm/De60j̽#s/q:bGͅs k(~^_HR*z:gSrJ^3ۅ "E, _|&F4$ C|*^čh~f?yNň7 g?YAJӥM?H WLq:5XP{($#|tjGNگѺUO۪UNXC/ d55ӏ߸s]U;TXbܡdڡZ]"dwi-ɧ]\~j0R8@1S=#`!MEBΰ %2r)q`5VEwWC"9%roǐC"wH%roR(BM.V&rs2`Mz$4b &rG/k(&%E맄6sJ5l7fgs`U;p%6+I<53]Hz;hP: v^ypAѣ=ݡ"p-Q׋ B9)==AV -u%7x(kv奷ÆֱPٞ0P az[h-V\*!%EmQJBZXOVǂ.dbRS6XA&cy֜5l/"]kÚ<9k[\sG'+kѭ Ҩ2I\q&BF6BD}cx!ws(OKzW&cE+P NxC.o'*g̙JiPƑϖtlDbgg73l,!U`8D#mxQ1P|\ ͠ŢIP Bm):aA]Խdg͢z5a;vl=\ SJhn9L+v*}~w 2cI*~9h#swݏ>~x5z95HCV6<|ыO癯7Ӻ5~畟zm@$ʎC4AX'E*s .hM W%ߖ%Z?]\QGmK{QXt*˚Jl7%FvE<%N_}t(3SX6dl_>Vܮ5]-ڛFIf pv )R8tkT+ S3 q s-yl$sU:%w-F-*oV-Xcq}p,UA\r-ʛbEjhwX*Ƿ0m$S6vP 8[d%)!!c}Bd + BbbR2AtBla֣s4eLB+LhPF,>VvJ֔r]XvC`@-8a$% 97O:Vpb7LNQKiB_ ڪak Գ1x'P0|Z^{TZԃ' VEB)9hW#p:8݇BvH=#^ 
9Xn89VImOLŝ"/B瀞DR LSA(0j݆C3$iv)p @U*LO*ڥq%K%@?ʄ^V0{m3鮼' hYߝȉ] ߨ~b R,61LާżZPϖC!61w2dfit6> V̐Q#I5,O2x΄R!aC>Ү) $w's]YoG+^<"22 ̃a{v ᇕAT^-IJz}#"]nUF%K2";#=E?N[9["FLjSjא֓N04 #>{!E"V[NEU2j%ڶέA=y=+R ԼcpxV ֆPoڢbc|b$YdGbՉ; v;L&d1;U2:[*ihfE+Y=T:}yjj`~*D)(jdSh Ŕ=kaFNmf0$s^c|~-7.]P)&)X3=,K&_ݣVu2oOn -i ԈаQ hY܄T!unak5Y@bZ\?ik[cU,LsDZ7Q3Ӊ Wr&hoƮ^lC ]9sS7ԌqsF蹲hxRYһS8akanCS[5b :Ӝ: CdYp0`'h#q8CddYD,^dzŇg SNfX)>Z#1Z+]#!Sh#V/.1c'w3z2.kgA"ܞ!RG |Cj|`oCLAwe G%tB\q"ӗev"B{Q)4=1dL0zdG}Ւ7@&=ۧ5 yߏJ K_fz/}Z\Z68~b` كJmFQiǖ8J!e={ RvѐX4d[ZEjƯo/~m7]oA4:BKlc#:mj-c'vHxE SqHa.86(/'vn4Uz8Ĥџw! 3l-:Yjv-dfRXrQ j)rBҬ _0{Q`lܴ+fн!+uM^-6؅bsgNآQ2! 644h Tca v^1q}e䓫M~ ~bjM&*R#DM%"JAh[ȄCh$U䶡2QVg$4 22qY=$K=nXмojBLfQβf6>T#(&A%m˶B J|M9heIY]1*kOO?+831^Npd =(V"?dT]$lҭyQaibY°ݰ暘8P0 EVMSfsy9N/Qd Kr x !>Aˠb ,r]HVT)nde{`Ϫ6Dv141"[;-lNRzoGm{yҶmN5#;5eqr6> rDȰQI{G);X]nqrSrJS61Gf JDԦD3Bjb/@^Q"jֲK- ;2zp1-Xt(B'XFF6;̦5BIGlP6 vFD $mrRr㒱KH9kUAn+#|95ӗnb kدt#P ZWo{Ѩ|`oCSv=:LJu4nPd$ z •e 7GR]M9{bhMd3%'hTsڒc7@òR r}aX$n# ׈ERɝ`#*YJqKNac@b cMPF򞽽@zL\G?$len%5<%[+.fg(qi9dF\]cY;ez6yއXPftdE$.'{K,u0{Q)tF)[JN10P#1\Qۭx8 Vw{*tPZ0O;gxz/,0%[;ώ.YKC¾Ćd6 oѴaæ `$I3}(y3v&kA{ vz eT(=Z1ʎVuiV f%ҎЛz iA{V"(MMH9.⟆ɵLItuqqu!S7#Ly ;L=<BƀܚtyScu bZߴʗ^H-5wϻqe> H﷉^8bUD&ZgeXFD%?oRV vNQ{wbb߽n#A f9^_>j^|3YW]u}9*xzW}8ú6clb[y[-eMD67CjHul䀇c5\# MˆL, meO%VQEϻŕckrS=(Tߞݠ_1=:i/s"2$e>={r7=,noص[)?;IhY,wBw6booK|`pm2vվv+`?9!|zzѩE#}v@gO=j:,YflE|7crO2y ?a Ax U?~DN6&fwz<Ʒe8L,3<κ&V(:tELCASR35 qO22Pb]Ҟ$܍ V߷wd]C> oOc2N@r|7ZNnyrt^ug| v ȭf΅'O$}k U `7ց ?OVlemb֔z{sw!/7v)'xy#` %17/56[=*6 >t㐗>c0g 9;V뫫7H@䭱qh;ێq'n$^pfY$yx vAF?e?Nz`>9p&gbú.Ѝp,׉L 3ֻڲ-+3F Rv(h| 0{"}qm$)[4E)p%QڜPPT>\d |<`y5~A²șfv} !1aioJ]#xۓ|QUl~(wgzd; i|xdY(edC%}@9^"g=YW5ȭթs9E՚|[բFJF `mKc2kGI '5)VZ=A*7{Jgcuț2#izکөdכhxJFcN1%c PyS, o~J?1Jއ Xp5Q55ޏQcJPiuQ/!ūQ5`(*>,Vc *h~m3E=$i.7Vs]ch Mޕh?cjehXZ]GQ&x%ʎ2~ q=gkrs+עkoZZ}!Ej~axqVW~MkWFPfEW%5/AZcn1tYP]GE>8j͛=.i[^`9k0Hrfsv*=VrFNkp*#¾:%~x|ysߍGL1OAbfCf^4`~uY_[nO-o^Rgjdmt};kfenZurg^|%W/W+xuK n#7h{ow> V T/SشI"=V2_vo$O 
W͕-hd*Oj';C^WۿBmTڜv.wv_VݞxՍGx#[@#Cvf;>⠭9b6P9T:|=ҊNWʞ<\_OKᩳ&_‰T2gI?p7X"]c"{WZW#Z"@"hK͛9kJSXktcW 95 L ) 6噈3g".ϺⲛkmJ\Q#[tn"Ѓqg:~I;ٖl*e&-'Gs b=LJTL_`$f2Oeǔ4%)}5ڷi1 PyxrEy[DDVRi-=K_<ܢswkO2!>3'죽eEME' bfzT K*997@#O߉z_'|'wR^^8ȍ/yK6 H</*spc=qqgx>^ݍ/^/yd $ L ,BvXζ݈svVޅn f)JV ZF*eqkMjep`vLp?K&oIcj!"_C.OjU+Pl^ǔKe_BB#")z nӣd1h`$V {$dX 6FbV}{]Q&둱/@cR<\d"T#JW1vxL'Z;c%Cz3DuI %L#4$bURXwπ(/ DTzxWTk+q'hvN`dw\¨mgKز]Z I!۟BB wj(-w%JBYvJpF${C `>RɱI"h$Ӱt9q`CZ./SAL!u0kq 5$$Pp44@P$0aOF3<ߘcro9jA\ř%,Ms28a*g,9 W("S  e"P.n[ݴFM;F Ņ2{J#Kl/J'2IdnJV"3\B]KeC!Tԓ־1ZhoZbJRB244p) &$P b$\'L^E5jG.ҍR UtAQonJ& ̱,,QKp .ݒ%i<ߛ̑Dyw%Jndiy?5S" dO( 0FRbQXkUEE &\GYdU3mKa)#,"\HJFT J,BTvO+m K1Kbm_b˔&e ~,΄{#};cS3;.֚]|*}T}:o^2>Yo- (?Xe6J{!CA wnpSDjىޠd*kE!݅FU~pET.w<,'CUr(-cV b=[wZ>.vZ$W] 8]Kce'5_3!\Qmԃf޼oH[2@ZW)aqZ 8Swoq(q%3)C=QEWĵĵ7 ފkQ1ŰU:@e]oz|B~L؎#2\v/%j8JHr-qHK!I#ҹ"$655)Z81hӕ1P\&#BqYbAL>uVE7{&Bt(hFs_g?6W/I}Ԯ| >dؽ)6)E 16^ܾA_zN.;4zBA5Zl<}DpW̩{Lwa{5$/fmqL0Yę: lP9mYwpeaȅg᭧ȂďPדYqu[ﱶ#ۣbμB>7D$ԪX&`BB亪yXe < ?)7,Nݥ͚FQU@V(q85xv7!EQH1.B*_{z\Li0v"qxU#(_dl%R Z?.8^ePbj%,fw7 e 4qDW@(W1%PƼо- J\r!/B5~p/mo)XG+˽o.?~e~yA`ʯv<>d#~TgV/wU+L[` W #35B9Eglh][Ǧ88A=YC)A)CD/tYBپ EH"P|HXJ*.!E0*-җs_tJ6F KnP,zp99ny1~Ѻf8K5-l}m }͎k# >_PkЈDc%RHҌKg4aTǓZ+C,(ow聅 0.UЯ@SVpi B;!$ ۪ܣzFmZa]m_*sΎ] %+YOY@X psO!ҢUF),ύBn5ܯlgĜwxX"FӇ>a0#r%TLˆ&1L @E*౒`/4Z8lA9zBYQQGA TŠ9}wyJ1ٍZ;;3:ۯb$x!Ǩcud6jKobиJUO,~68`?"ÒSm${1l`~TU!e3ə C^ݍ/_/ϹSO-:jxd:%T%>ԊZ/e= /﴾'ߤƊ>zC 'nTTuD փxiKrA'_'eJ(ݩFx~t F6Bh-J5ƴH`>x1HMv1HdO Rs 1pg>啠KgW֫|l< ~ac1*gas};v3PݰzD]P#pl3K<٥FPpؘB}a%.Z\,Ţ?럯BOoKV=zޞ_cLb AE2OPdBs+ bz~q/JmbԦV5x_%RQS8Bb\QD3%I#2 )J0řG:r RQG%Hca4NU`pyi2΢~-^/[)CK4}O%4,?>?ǷKgwYzh˖Dz),Y+鉨D2*^erQ^s.3avЊqMq'$2 ?( eBzVHp eY,Xi0?zBeFYؐO@7SWӎWxţMvb+vbx^C"*$ q"+'GG\FK6#SmFl- t"#tZ\ .Gpo>|b֒ l'$8ݧR6`t"_1*F2w<] ~Zzmn <ߍS#n&WxUWgdTʠ#0 `Դ $D{k؂Y̙[g'ZZD*5Qiy O:(C4DSC&ނL묺};ꥺ};J)0͛gŏ~'zRwcף3#9z>|p*(8Ln]7k>>lHHBW޷fE_#9w UUNNZzPyotd(ה)"ρqm޻w:`)-̀Dr*U$՞Tx/ԌT|HVo"v[Eߪ,Veѷj/vA2iĊ6]NNӣ r}FBQse tXJKw xCm+br< މ>yig, ÉCIS`,Q,2 *:t2=w]:5.N˩SrԸ}T{ؐJFEpȵbA{PL yYlz};So6)#oq3׼ e8J2iuOXkrV@d,Y8%9*bC(L P`j=k 
5ErԤ`7P;-5>wvYvbo)y c5'HH☉JhBH45O: nLčC:J |#OjDqp3 ô#M%O3XAx_o"?`;/1L9N)fkf$ѲõY^Ba'{,(u̚Xfs1J9ȝϾ:OlkֈJ#,dw)*ZtBè՞-czo7:wYhA"[朗A֥A 2Rgd F)  Bz7ˆIrkhn6iRhdIJm2\Y@)}*P.аCTw5VZJ S"wP\-OJ@tmDwYs(}'Cˠ5wDAg$pS#9 &! b+79Đ9bw㬁\9NM 2< J Z<~zT?̓Vw_?% a*bly2UqEiيȻ̔>ҮkW.krܴhwUr$RQs=R? uh^4+뼞skN6w@yo'1=#= @z2.jai̦f  cʑÖxQ(RKH#SZjNSLvY" YKd~ZznOt\[B<9vsyZȕNEbi~&O$1"38԰dZ'e<f#e@Sseٔ$v&CN(8B&#Ab T:oh8Ι%sTj- ^H0G ឋeib[ ƨj+Bnw^RܜYrK4eڮ-B3xH,VΨ \Ή-F%D3~=$Ӓ[~ Fe!J=8(d[6]jt$CK_9QK먚x-7J4PqK \:'c`j #ïj hYjkWKx7*i(Z+Xrb_KFvntnܻqRVK(=SKάglV %jSJI(][.](@Izh UۇdJsUΓ.fW` 2yq/wAl wb W`%d攛MmZ ~.LoY0$%PvڔpSgz?FgSqSslr-'7^8- gH?P_`51}]mpw&]u0Í|fY<߳5Fd9i^;c` 09#2F/ʂ9szQeS_AQ'9Ðx^C6JZC)A@6.(Ⱥ:/gܰ5#j=|rCJU2GB./$¹>1糦]so۽.'B1XeY^KnNjbw@תۥFvt\bXaK^ū|VD[E`? >ٔPjݗqFUy] aD ;3 _ ؒHJd3IMbUff eGOɧ9ĸpVv%ZY !/i?] rIJ+x6m* ymBŃ>h&y[n ڄ[kWdzVa5c5KEͺdW(i6$q섿{EMeG ۊ;TXؑDmEJ$h՜lLwNnamĩzU Y'lpw+94;/%^w޽#eіRntpwShk v a[ry,ròOcgJ9BK^/7A7?<Bt.cR(j׎=C AR"N:!FNR-se_vP5B^:{_XIaA&d$FJ8|OC#`jj.Kݗ?逶A$G ""JK[:0A3D9goL֣ކ'}8 C6gZd->vGL&Յo?'_nxЋ `o&LI KOMO OpC:L?<X?ߢj(8pJ[2T T/ ڨ%_ݻXO6 DU µdW}>8=R&ɝ/4.M<{;!9 zv Eo'b {02?$AsCS:)9?[ee$Y!,LPќILRnx㸲n!0Ȕg fJqmN2tSY/d)J-pѐK;7>BCuˌxX!K1\u1so_7m}Hf="he0ҧ,QՌ@j5LgFE#ث4%~i{8 LQ+I)A[Lj)gx h@S]c #B)l$3 C.1t-G] L1NMXݬqFi0Dmج ijhͺ4Vjp%)~e*f;ɨ7q ٬7%n[Iq*W讻BpЧuj}OsOd*4AĆ9EAwF&J*3+Tr5ωvb?.?*σܲB|-Tȿ""pnYX8txZul_Su,ױ\C6 9HGtqʜÁ2 `Z%:v,7ʒbμCjn:+x~K.}Ӆ}<Z jhiV9} EgcxtxR>㴟#neEPB ri - LEDો{Oe/~u􋮣_t]wj]P#plju7!}`<܆ۻo4٧_z&r.MureD<8P42_7ZWwOZ\8iq(~CcioMb;..?].r59:B xc a* 5fCpN -:AzRV 7j[@#ZTw†?zpG<-u=@^[k3 ?1N߆xa9QB.ԵUY쮪(5LAG%{ELs~yc[h`UNR&WN̯^TNe.*'.NVD JFXZ.d$jǦ Yz-gEI)V:鴯8z;|UıLu|=q!ĕS}tc_gM>F|[y7~5k齌滹.C^ľח]?؄W|hzk7h%"On]BϾfh)RSZXxWROhp6=DIXNPNm^7-e|Ɣ̚2B+# 砀XPViܣ( a|ᔬAOpnN=tnBR!TF7VZxgd4 ϴJ6BO &Z$#I*gaրPj\n*y ! hex )p~tK'ZH鏳R 1Gt+{/Ga'"DɜQdyȹ2E jNԕ甒 ; @Dbw񒰌&+GtWt*Ֆ#pB[:4(IɚKne0OeY߼p GE%)}bv}n|o"U~0؇-N^ ~$qɊOazmTP=7/I R36. 
BWJJYc\VY;kW.'%S+>\o̤'%qf'(cTڡ$l~Z`0KpEn馽j|L%<0A{%2g3d*YF ~xn\k΂yXMĨG6"Yb0UW[ b60r# S`pcJd4^$<$׈& SM_נnbH:(!WqSqj7Ӄ>8[; d#aߔZ+V򞌿N>#^Ӿğ-+jD_0q08w&t3BӒ<9L>gH(ag!RhDf Dx4zMA⠹ *ybvL(5Y<*RLN5ƒ.KCi릩gxWn #DiPZ61T(NddMFTKmU2_+Sγx%JPLX=24\p3Ȝ\;4^-ep2xe-z9uzez19bT#6aˌf?Ԭ焔\)˦e/+JZ-%Y=| cXSES,Fw6VKA n2{$D M*pO}elyG2GءG߀C2֖o0@yq/=5Soy"` N`ZYۺ9d"_%kY< )Q= }es$=zy';Sʫ~m8 v&o/ОWWoZ#nlV䈭]e8czʪ: @So3  M*okSB6=rLb*fQ.QLO- (r/cCl-#|URCzWe`57qF${_LMpoʃ֟.}ٿoǢIlz呚ej'<Wϟva]NJ=\ %hPt@M1XYD>g2`f=QT<k!bϣ99s^ ]>h@hɚx-[ݩOcdSh_VN_!l7 Z= .c-dPTҎΎKhh~E44ѐ`IZ)t48{c'Y~̠QE!ץq]|5mR.'T2S/ʻK߶#CC,ِ܆E=X&N#pP#Z0˽1hڸ= :`KҔSE/ C T-:ʽɩK(8e-q&FRog&KJ7=mNo6;w>i u癧m-p 'V=O3ǙLwvae":.D^r烪&'4)5|*qDzlo)g|3씐vdWgz34tq8uhv6ԯKYZnr_Sr,yoqڨQϱNHxi,Jz3 yl/}랻{tO+:5~Ͼ+`P3r5D$u ?VRib$<< b4w\0wȁ+2(ξbl=D i=(|k<DXM# >!b:aㄅ6m1^W).BJo䦺i7S}Ĉ1Ps کɲQsکcR[N6h'F*NSI 'R;B7Q. Y[s8FCjWP*Z 䶫i)R>\R!i i63kJBwe`1jTw֝;ju*)@!uU$aa\kFv//c:Qq?km#Gl;yX$9<"`َ&lf[duZA챨EXu|e_oӶ{RcU2ʱMWs lFůZ(ТϠoY{|K<=]ef+]+zgm /n y->{@>qo5a~j@nx8.,<1yR42Y!rSds:^t}͑ N? 3~WΕK_cj [$BhkQ BY *s>S9O8?TȒq4ҘpƛbX8;gh2cX7yu5_-^*NI;#^N켾P_9-Ff|T^YubUJMx{r-8{riT\z,K~s>wM0.:\mSeFH޼!Zr:]iWurUPWLq3|A^6ɘux=1LbS S^( ы<n #M❷ZJL):Õʰ6VP]Yf*,ElaAPV"R^1@ik9)Ƶ$涼㛤<,aux-d7\hb^tXzgݶ𶇪}V#¡:A X3_@)ζceȮ`X)/yi=Du `/;'#`OQOYě]]C<90o;቙6QVqq}kJS~6? ۓ/pr_dQ#`x8i X&cCVZ1B _4.NHx_r/pmk|*bKi2Y@sMߋm Lhe2h\{1fm;~ iG ŪMLxF؁Ҥsg)lw-^l~@̰D&tgGJ.ΨJw!JҠѹW%ZBfKʲbivY1=)|t8$؜&lXTfg${߮QS ᓆ%U{ad T>`%!V仅碁Hbg%Uܸ0fx_aIsޗ~r.nmz8]KdH2Nfkif+ƕk]Ha6n8Zk]&mwm ',cho˷=V/NIc]RB;例 7GZye/X*L0*eRXֱYѯWlԯf"FTۗ/&!-8Z4`-6Lu~FOt9q-U%}oz6%2pTv,1T4+A<DfE0 Uɲ-XpԔ0t͙$|C !Ăd}]99 |sH$:M9< \eNX 91&l~xR`? 
y3 ;`n1{&&v/Un`W7.V*̀q9PP* tnY$^`%Hhש:Vh<0֚ (" _4-:NRTZ1H~ȼ<`~e7ÊxjN`0'<1STGYVx6P$ )ĜNY).Ak_1/ֱ g LKfKFAEBamvBoYCDV5ѥ5Ȳ.5^Ks~ ei}[ӊ$/ "8@O=fR0Md qVZS "4}poIoPi<}eUjhU\vT)4g/b68g "C0>J't&#G6#hkGhi/XVhi)~!U6b.MW|3s сFԾ\ٍAvG?-uyLM vhh+ %\/oG|&W ~nc"qF{Ȧǫ"lh*|,+F_ca0Cwp+w3GS_}oȇ3 9$k;G1{R#cHoT]J`+QH=5+ ja^yNonz5îktG^ zr iRh7P0aM Fu R'EF ptphXvF ޛ9@{35-9n7\<@7P]uhݬ=Q= ZK]&^FM@>qRovw¢(˨צ0-mAaRAS\(k=U´HIM!G^Qm ӂ&54 Mp/xRpiJAShᅵL>ݤnpmP{Ɍ[n"ӕ3.[2ֽϙ81?F?bLft޺f1GqZ2ֽ;EQcAzT'EM8(yz6)8Z0qSN!1Qw I}3)$[*cu eupz޾)8whN!4{ u rLiĭ+=]TD ~F>םmܻ&&>(Hvɚ(V6Pƽkҹݨ1ſd1m$7Soܻ&}3w{Pj~,iHWڴhYI$̖ ܎:'|+W (8Sմ]lXYXwaYQ_LXmc値 %iܻ RC*]%i&77w[#j!|*(]@o‡;w ?ƗhϤGbĸEMFl"nl9Zft٭ch޼!43.M=xW(Ҭrנj \F=lBT̟Ԑvh]mzy:QU9ӕ U@Q՝( :h6eDtLFB֐ueEڤfGG:$+ͩ+7UƤ =6`T*95x~ Ƴ +^cz50lcl(kF#Lw_ \yY^Cy&ƮLIXiC AMØg_==b>ξDf4m*c'#X ɨ&7W?dlT,3p6 3Qx%U@U;.B%u+w7$Gau5Kj{v yGZFq#M4jõs1{r"ٙ\˸0p@7N\X;AM.NV!+hz!㛽bb鄙X$[kFN_iC|Y 4v|Y{oY^RO  tQA!n ]()==qIVeIk :ǫeT:dP5vaZKhÙ Ç 2bJa+ӽ$cАkE]xzM}%,+kjryE4pe%qjmeYE]tBp;=r9\y[3Lp&oFC}&(Cdi|+M{gi:_iQt&+^V:^י#ՕV٬6CC~Jf|9lu=_7k+< h/|t2]w!k@+G]rd@%*19\!R\Bhl.z!NvBPA㖧`AV&qQqtTgyceW2Owu5m:̋\Oxr?}Wu4gypGr$ *՚|Tgg|Nzu$"jZyww3|1$!p$SBnY-hN9hWZ([-n}H\Dw)6!ř_Rx<5W:KQD,Uq_)e?Ɗ4 F\Ƴ4dqs.96pFToj^I3uZ[2+oͺnt;/I\;;'xi*$~ooDWt˨QYZ=)џH.lt T3@=}{OJs ԯtb;<%\] rpm&F778T~5̣ 4h4edW/v/ߗ@w3Àu8-{-(6RQԐZ.p,j `E̘N=AEM2 w5A25 kB|Ǡ=vPT`X4Y(ir M5/;?L9zm/wmFx Nz3b׍|SNfau+'k.MjٔBi_9 - ռA5z:;愸;eb읢7-SWO+I_Uje|~ HԍB\sO]דWu9GJe9{wB }S)~ TYS2m^!tBgŔXYaEmvzt>gx Zҽ >k[v;Xo0]jftMB ^S!k*k7Iι]^B%% H֧E_lRu3>ۜ/%.%Ipp2ЖmTZ O'uݞ[9g氳ziM;h,RJCc[ۤRtWJ5੭1b":\l؇SuʳqY2~wbc4WlF=c>' !Tuw~p@M?~<=Z;[*X?l?j< L |@5Pda@Ѿ(|+T wQWa֤̤/苠3:5L/ }טwsQN2SJ32v&_i~;8s3~ViIֺ͒5$ VJ&l:&/%M&SצlMoaE`Kope\5Nr"۾UZAf7 _LxEpo-Cnp)"CrB=w6n?=.l.zxS/h?qAv+n}즸7rmɽsoIZpabŹd;ц̈97s_e ⍩myKxUw-d@ۑ ]V;а& %ɥ߼Z{z|j5,.Yml6 0cIۺ toA R9({) ￿^OGto89`h͊Bɟ*'WuhS<$=S^L\ '8$'1<>D-$g0nf+pCL)%N94hF'hb ܠ*8LICrNwL E7á֠J@=*#0:%q ,E?K3)sJAb'fIf߮iIRV/^-*QW[Ĝ6 @ebTXɒFD!j.DMzֽ@tV2)x ''+KS3$!^#FS`.ipJ/bMY03BL6X-W RSpP'\++BX $Sܟ:2u+T-:e;^_2¦:/ QKG$Ѹ) 'IY 5b$ uX0 
0`\K\0P<AK7y"풒Dǯ(6xK\lܘiհ@qq,h%zGBQ/C׌CԢ4kCiSC,XJso2i Q%͕ЇBt)*P1qi\`Z\NJH#ìk~O }:z'AgYPӊ4>5:TEχ\!NgXngw(>|xaTϓ>_J=غn/1|]~蛧_hA.=K YHD0 Hh?K'IZM $y͕#&Q F_VQzpy52xƒi T>8WPnc(<:#`XJK̎O Tn =p5t`C 4Ri }HRSFtQ/戂EDVJB4AQiL 33#*]":YprTÝ@ 9d%WnޭW#`)mte5^ #cmctVǤM%t:1l܈#  ZP25&/TԴ* _nnOhV둏kI×JY:+ t̛VYǗψ\\OުE˗GA@p)# p3TGQR27-AJ=?zw4k#ܲ Zw?x.X>M ˤΕu+Iو&&Qŵ;ީu:ԗ:ȶ4:G_{DG7N/wyܢ ԕGXK VUָPqGXVӨHS;rD!9r46"oSTI5qg^7#ǝ`7NȾK}6s!Vqވb@?z//1r"G˧x7t Ic 4j堑|Izu38O_^4F4+!~y2_ehzg}<_2KOevr6瘣hE(C$_ %r6;;=3L9zv(\ߍ?q} t?ͱ5OTyhKĚxGS%5FUځ"8G’e򚑷ɚy8!^)$L)rAQ}[@N{s5T=wɉ*^S= (#*=~ecrYb]amœėFW̉hr.bP`lða̴ւn- }žәRnnj_ô4J u%p\DD}u=QoRvΩaWHK:-Z1{^/(5g˷.[d`K NgEGmi5_O}DBeo< i{ưn<}Xq Ja t_؝>f>v؝b)aS]N= fY\NgUauʬ鸨E*QQQ)P= ]à(eA1ܼ=׷b_{$!\Au{K&%ѹɕSE0W22ue[/>2\bkǧGˠ-/xq4:r=]2]CŎ|P9"ݼa#@uЊj m3t=N x9ZGes0G5_6v7o^h+C~TB47dH:%| ]B׻sFF]wxz}㯕YLl_*!["gk]luZkYȤB!KY t1F4Y<֎~}}!}F~?LY&<+::y 'ٻOrˣXφbͬs4I\.GWQ́/F N7h  [|zøb_V6d{6]GnpiobSG|ʖPOA 0Љ$d:+ 3TPi#S6/l /iO)8: oso+pt'87E)j& " ~OE" L'5U()ASML( >qBʈۊZe@I;Z+mla1E?6BAeGY DFR~Q!G$-9{r$j'-9uRMMӮ}9J.)AՓ.{BksC?v˜#Dݦ.?Bn׵Vn]PqP,6X!89Y*&ֈ]Mn^oỴ y9ghmNsa8SEl->@z(u3|3ް縓} =0nfѲCIhZuٕyTy?/${YnCTX%5DăW sm^ETw A*ӵv^bls_ ؞仫ՙi\⇛Y܎:[ +nm?G_|h|Κyup?Y3ȳ;#Yf/<|/<7?(mx>Ix|wZ˧wΧ\t_˲9VаFrU쉑]ky;gj2*r@R{VOY9kK7CC+&-!mAs09ɝ۾F>E _lƊa=+\.v# ; R]IDq^ЦFH{GJCe+gwnL;[чٺO(ؽٝ㲽=qΙ=8.R0F:*G'9Q_3ԽYN>7-AwS|?x_Yxeb}Fm+3J<崖8Ol`чedBOإHwjyc8(ZKَ7^oN"“R#+=|ײV"kTDk~6˷% Uc,Ƀ49D !$sJLhn5r_YHkAhMWçY _֌q]C7G5Cbx^Nc#je: @IeGv fj7il? 
E}~T'+OQSsˠlȆsBAk{vA;Qohʼnz6B1KV" Piι!p`ؤTj}~߳"A^^݋y5Yw"{y$$ĐPҤ$3}T`\ޮlT'?!#?L MVϚ1u QxڧS 1,ݻ!={'åLZI:jt& U|W)+3 цKuo\SSɨTrzkƸ|wv>ՍƨEܤCoBe\ӊ$* ($(MժkLMY6a::Ę !ڀBS07giDmlko* ("@rVkx MƱ&>ϱB ,l֌qڎHchU@c.k m bU tiPrIp4=$hՆn C3B@ `4_tQM & G#Gگpήdr砊ݷк?:6q!w]1%e>\K"j{\#j@%]L\Q;VgTZ䌞֍8+TAX%\r<-dTj)=Z Xɓε2I)P]RdG:E!.=ަP-4y֌qۏQ7^fp.(w t̝BX 5 TPr})=ٮr@Qz|:g990Ơ8u>;<89; oM{Ɂ)"N݇{Gv\6?W1Kj HV`?WB)k֌nm2쳞pwyp{6˿Sti#/4^%ox tYSb{ !/ gg_pdp-'jGTP7G{]JsuiSAv$ɞ G.,!ݛۜ9}0օ`7*5'[-27RBh( 7׃}pFm厛^|#ny:O@$(Avރ9FhƻJV|rԳ^Cэ;4^3} |"_\?K;*o 9=|3傛/`<#Χ#~\bBֿSw0]'4LGa:YM drc L`\@m :+N4(÷BqTH"g&z㊇J 7@1ݜ~[$"ޜBДվlA c_sD 1xH!JԺ5x>Wg!in5 ;$F~~fgUxuvrVtۉl'8vn'֝.٭$sK Zh \(ʈ(g >PBIRGvrr.z 3[w{Dbn7r/&#@c{'2#9a~l80I9N:qa=NZ &#(=7Ke%*ܮ8B$/$h:q8tpDE`R-#?x'~ƒeHz6)&шkpLEFL$(J J͕B0v. Q^_Ax4oSbD[#hq|I& ՆP|nΘl&B^SbSpFVKVe#?xqU YrKHKЄ*F iL8ܼ0B"Df=$u└z2K,$ R 5)$HD}Nb,16&H @CҠ}iGD7/JR#6IR1eXa|KOKhT'k1qP,),"f-,:zÉ']p\x0JݬI@c9-^\H쨛p }tMTnl%bPh(YS u7e|;UW/_rFmLBp?p#0#{'4G݂B|jE#s)6>?œk[^=p.No.fYW7|!יPrq9*hBG?{jLHsΉ"6+vWiAh5lӆR ҆h3=GD{$ȼ4PW l@ ҹ%A[s"-(3b [pI&NlGk@vNgz_2ȝ[nK~{At:poШVbˎ$;`%*Y\VH-C,G;&Pui3xZC-~LTK' %c(cCQQSH،!O87[QSzMf~zyߥ[sZNj3gV5V3qmeShФ)8h屹utCYSP(o53n}x61BAb^"uF8𫾏8c\71v"Sc 1)b (mr^ҋ~]; 4%eQdQGF]~,}e'XJy<ؕw:3?5sc 3j5w{^4mdy&>pӅ^0ϹgtwO3\bcw|gKkenyǨ\(3" @~[sfiub5nbn_|jSM pʠU8o;d>ٛ$ IZޱ5͚BpvQ-@0#!yЃu/ct14:ix,|j Q#47SI{f{ALo٠(48z'jHPp(mwpGPH/Q0 ~A0}#FB"5# 8 2z |ک,e4j[G7Zň, Ftfb֙(֙ݤ)k1a3ƌ[GpQFQY;1эf1` #=< ͢ϳ 26A,m?Ny@0Wk8<ȳ7$, aQ}pC?-HS/NQ5<(8xK#D K(Ж Cp5s5n)X,fVw:&&@V$!D $4\W'UZThH+Aʀ)erZY^ 2U C)$4L9H*Y&@(8Q9 RF 0H(s@`SX,DzT͉R`2*3GkBrCpIB .Oﻨw޹t|gX˦3_iVB==~S|^Ε(JksXulzR#sD8GcL!jAL=Kwy+=Ȏk ,̯&[nmz|Ғ KDS. 
s.s&J& Jd,$UQPfe& eNՅBu@zjRswQRcG;oK5r,,pQ=S2(\LMԢХpH (`4#r⢶ ?«s\:'HZB3]Yu:E$`%@%AB\TU晔9 -DU"м:fE !TݹwN#jX㰄K6Jߥ,Th2Ap)RȄ>U*Y!S Zs˴r[A۹w.޵\ۊZhI .mV3R[TdJX"P Y&+_e3*2Jd ȝKNjs:GJ۩3qWs)@y},?|,ڍӕ|kALAjtV{"J7Xy(*iZj' YU)&BrQU sW"ܲðC+Уt0iHY%ũ&tsex<[l(81}ginޕ]׵Rv7bQ7k]V2[d3X/}:jJ.ͥ r-ܟ3KyPjOK<,ӛٽUJE,ZKj܈E#jEW:UOڏA;@W-]F-1&@RG@]iẑ8t6]Ar7_zЉ{FDwo BGAuc-q0 Z[}&1 1LW(m<=Pd<FkC ta#w/!i0_+?`J8SO8z5G!yᵙ'COV޹38z5~JS3 ^Qun/=1X_z?I/:$A~Ư8I\mtO?YOzvMePP*yBHe ʼPcMHY)&o:˞B$k򢕓l$]b}(IGI 5F4쉂@#ox#ğ;5^EMkD>S!0dr̾Qld57tG1$\ZMh$㟶x|brW[ë[p'"ʥڎΚ(}y˧ǝ62Bh8Rccwa/sL\V%=. i}q湬Z/3eu+9ɆJGtCj.:JI@7'ׁbB&-U FZօ6Qґ_JWwZ4/w?&YXbjI~jg~'CRb^V d9*O1-9X 1H - +:UU!S5_|=0+uⅯE_OgbBU-:\M)_wNeMijL o3V4_ B/.y/z)veza2VeVVR3A:S$$2π]Ks\X3BĨBWCQ.`HC8/C v?.qI j# =jn.cϨs&gq$y;u1C5|֣*/ RH*T2F5g1%P3\УSB:gS+0ƂW,x>U@O;E:cXJt/Fa)H b5J1>Na<~9Se*- -'nݭdNNZޱ<6k~K rZKx7F`JlqCFa!,^.Sq.uM6G3yѨi.յrZ092R"Nۑ`3uqZ%J-ocr&-7_ )F,sCZf!r[@9q#%ҌW9AL UB(2/JdHPTiUJ]CtSxs"J P^V~u5YJo͕ ՗M3Ftf`.`e:.9ŔJjˋ*U2,á9-ס·r:#-w(!1wSӝh|]Ǜ3OCxh\,ՋeXж 6!X8">cIxAh"SZ$ÔDW}uv6fNo':NP6nF5WQr'<C|(} D2|&VWs"̆K3%hjyǙňNA>L4y_(agzWgujp?y4^ګ7~kZ&祋д^ڣds"a!_I^`zA6!A1\r"$q<bkCc"R^wh Zl7(N%<ɩIV QA'Gn6jKɃ-Dz rvL0Heٲvp)+7\XŜzN\~6p^ɮDLX_m|=18Y~/@݊Hw ̍(ig;A1G˰G5beQΖ}* vKHF i^Yst6k\ph!a\Nv9P5eAB%Cܸ5rvv0p9>{b(3n6( \z0/ `||7\# L], @eu Άq79z1OJ~}> c5ZD\Ǥ2b؁h|1BQDˢ\ɽjdi>,c5G= YY}8>$k #nyB 'Ҽ9dwx61"C; 8ȑ^Ԗs] !EtW. 
+} A!ͣw v38?/}A(m-L'l$Npx9_A,Kٱ@A$^% ;|ol`v2xəoIN$ 5 8 |p?/}L"!d[jo;9v?nr ;p4S8e(pev9 }h-8K4OQ=V9(!V;uqȊFs'^}qTh/TPpկG %`$cUI*+knH݉>'pw=3/v(iS @GH@ϖ/2*3as5&gEa]T OYKjJp\'đ{񖉍r0LIEΤ`( Nqp43ٳ^/u5~Zn 9۝p08moѕ!Y*Z.XNW ibc\Jw Tl uP(:qSSD3R8x3J/X6_ #}^xu' `D`xhJ4ITXuR)QHZ "9Dr?XlBXM5Ixb.J+YHSgn#o4T;6u} !A{$ֵ'2EMyP`!d*⤀O ِP/hbz%+:8k8͓wa([-]"վNf3@⃆cI_o%1ڮߖ iuT9廃4F~@/F!]dvNivtʶ+m QВ8З)Dzp֋8-Xq싹byїA@R*, ,U?{IEZtĕ$fX4R"GbIjW*j؛iH:vwF0I6, 6P gYK$P*!ۨr9|i6[۰oYswg.8uj3KϘb:3L#*@}B"՚򜵫u"vxII(#6 9G #~~Έհ]_]*q:WL?M=5՛w\M3\뼤K![7}V`+u}?iRO͕ٜv qݹ-9XUa+$ Սy3 B5 ՃÈHzmCQ񐜣 O6>{yOb8mPǟ}mN1o8Duvx!.q"aڽu+7][vЌ\|q8gud3mN$U4 v]/8xF![_MHkRAlG;ܩFT LAe_ 7(E9Eʴ&cLY3U4XB"X)Zk҂!FD[aJb.3B6N\#k:Nx[g&MBy <+˫;kV?D/wg@.̿2&tћ3$c&Ikd3OXlIj+oyp pP m'~%KcM3^%ds.xR@-W&ΰ;T-Rዤ!RIVCΙ~YeÜQII]!FAf 5@}LC)JP]v 8 @3C%␖m r&ekCHM8?0ZS J5Y>hw'_hϑZ%Y6דVȆg%jlJ2"Pc)$:V RTLA)i,.%^?k=_vjB>lhSpWlL> ? O6z]Z3 {q7^xBC1*՟o?/:0u`>g̊:~wꯞO|1rٯ5.荙4͋v˱T(|`8\ֶ3gzc9x`hQ)? YW}$e X,ShDaO87Y~;0:Ņ@^@#ad26"@@y.: VΈ~F#(L"%*LxۀTojsjY] fv_>VDƇ5֛훁>•sTHL(:/V-rVGѽY^ ünJy}r<%>g:xc"p%ۄqb˼ZxN4JW8q5|^>PU%t⺔zk$F  =WǶ++TEw (Lmci$ 8(e)$B0T9fŽD~%ƽGRJBXb&c$SEzyP;O  u T wIso$.x¢ޓJlY0Zcp<x]Np-o^ Vyź 2v)-)FAVR+kf;%@>@4&ʋRMq%!m[/*[: B( 1zEI⩬`@K_ <8)e0s*-f 8X.鐗)Ik|PB[0FERhAȤ`*S 5Me@`X:,(0q!h66I E,Wx4s[t;.T\UN+JjbE̴'2EMyP`!` $|bHy "JR)Jc FRPy̨VD)VZr˸y'8T&bj,a88T%RyZF.D ?(,MPn v`f=>:3-51q5q9!ά'*) k-9lUܩ|!bkL`r_5h4# 3Ub` M+jmQ8#Y^8#Yp$`mj::_dc? J;ؚuf6U/^KO| +'sGBN\D+ɔn{+Vѩ2.Ƣpmڭj:$EtEpLsEߗ/Á){Sd2AЦ7.~cI=U]m`RL#ROG(߶}:IHojm%G3 60kOfr(hV;$7<͐I:xjLp}KjrE;=bN8OsNF]}%8ʉhv/Xr8\RI ZƤJi14y)I[Ƽ&-,q"Mڸ˨j,N'sQ舃n hvۙ&HJ@Q`>Z=ٞF%Mey %G={$ύCrhP B$JGY`^h,۱ry73fϞ 14 6`)F娕<V+pκ` N+ Cݞ(wDr *%^`o'LPq/2T/o 1nR7ubOsrbHb_ثv=˥$`9稴ؿ榔  Ro\?I W`JLo?B' ^/b8G_^I5vgWyyޱlXf[¼@B ͏sp2ᦪRɝdD2%An%^xKm {pVnARNƆcؾGƹ t̀)&kxm+JD"*ΈPt#=(`z-Y[XRcY6w)/Xo=s<K&We{ o'q9[j3.c!ga}V"tc89dwHɔ~Nb4Ir:@`M,'$9$hdUg8,uu@#}oR^9)&)N?հBX=M'eٗR)j4k1n5>GO& H-p6RYyy9{^ki9MT|#wU\-hFOJ?)8yQX[M(f̌\=Z!jx:D[l7[ysPA<**G$\[AE &r{A X捰 76 I 571W~ +)gyuC! i b켟ߺ`ǥorŏT?\S Z;^X&n6# o%װ#ᥥ$(R}?rM3* ! 
`W{K[X%4]!qNWCmrt 7.e\wkҹZ,b76i{؍)'8JT~!>}{愆Cz2h4*4W_gk$,mov7z (\qE:ϖ^}r |w5qb+ ߍzvܦZKbpei#UMY`Q2Ş9_ɚL댙o-$}bk+Y~m0HJ]g׀fh ^rcպ*I,WURFj-8=3f"e_oNb&Ɯ}w  ỴX&YDQe&+NNWWŀT!!rq,cTc y<8OL9;LrY/|`9O@=жzWՓ8tHұ G!A19&!䂂SY  IJ|9M|&QU΅#*Zrbd[AVw:̌ƮEY D{,@D(xe#7t /] ߷ɟb"Q~1=z'Nۮ@͇eXk߼Ɂӳn@."@}zROulr/ɓ^]"+4um%!#-_BSwEޓ8uܧC}ow;FcFnL3_K<7ǔ࿇5^ OSǑt1$4u"dȵ@n\*V1B 2{."ʿ+'qݏ}h/n7M Bbm͐!6c΃h6/ٴGї?Lŝ˽7w\S:;DG"b*Z7O5 W~>}ԿrU(ZV8"Ia"B"9MNHy*i*~ 𺞰>8ξ<$?`+J*q> r?ɲ4`=&VLf}^&.:ó[UWLjm2rIr]뺳9(Jy5]G$ I[Wg6ηj `jfhr*€Z~'f y0oJ`F6J 6j6:ؑ5?# rbdt{Fo sslPѯX6:!V2@;Η|QB#8*x$j)ƑonRM)K>" zա8Mz\/zzL  GIiΦHdw  4\s8N"J۷}FY,wl*xz|q+[> x0wԶF:`?88Pahidz%;^+dZz2]{5pᡁHKZisX/5KL|#_C`or>x>,<;nWPXbj$(Kl1d'kKeAof@.|n uLz`JX_%͉gFTru-.czZLvm7Y=ܻ$) ^5._ڍBxs~p!?Ё6W˩.BBqP CBV-CiҞt-N :A핒OlhX/%*j.?ʦjެ7y͞; t.7if^ߎ{:q+쌶Y3zRNE)v,^e [n}tu7NAHԕ윱*q$y>1OO\!u;;5`PLj[&`A4D V^I հTPp<Wao=Ff8fuSagR@!RBH\8" '

uzG[D)~Ő/ZpmF[pRSM 5!|ol2m@  IqEx`+D#B`!J՛[Y TU(۟AQ.)  %,uذmEac !ށ W3 {9 3)R΂~NoGymҢ~o*δX,N&x%84/g8~-[-(Z27['z'Vqyl a1'.`Rpbx]x& x/R _W:pN`Fac7O1wQ|ų>qBHHBDT x.%>&g)q,"VYNىBFJ#iVmw"»XL˵9O_bXE&Ʉ)n箵 Ff<.Kh*y.,$qcjٝq-"gwFǨ:"x|#ȩqI[u=DqIF򴀏T@_m,K:/_L^J/[%ʺ kviZ4Sp֮3叿MDfɐ"7^R`4h~)-XS̼$K9󈐔DPi %N^ Vfem>Pz_iF1Rv|ؤae.J'i 4Z"#A11 ) [rl?6)'QPT{{ Պ9|.!ڮ哅_b]!H'1V+U+WY6GEͣIyC9~uI!^A#M:6H$o>~IC//ql/G&ͯ+%@ZC6+U>)-uJnd-UZz&'_'̓p,Bc׍)V姰lH8&X7Yr{fuB ԒI4lpl~~NSBϕ ֡|MynS^|gw.`S}<_kd) f9',ƃ:C pM ~w4 |n㈆))MQe`꿎ώVڔ{WVQ9ݎmuo$OϴƲ{MV1<'90eT7'Dc25`:V=_=/*NgKe]_CH RpU>+~RCUamuԅ/gol;삇H~5~a3ReE?=jqyw:;|߻ &\ C厞__>?/,;nлd\ zGPǢI dT7AڟGi4E>_@TW8mN;}z]7pmsxQFWci_!d7Y_t74 6v@m 6Gn(3S4pcwn _0S5i3grx$`w]d Ѝ-JrlZi"Nh@1L׽hdO  yR`Ʉ -0KpmYiKf>+5@q\xPʉc2"qRt 0(0#U$h[ƾM$hk=BJ;+72q1YӋ_~e!=3T XgYI]4[oנJq#i13ZP-n;dZ 2孶ۇXxk>|)9C[s=_|3rX9XR_##}fuY._ +M\;Țo.W552Wcuו]j]W+9VWj["c]:dJtN5rOw[a Rw`.ƆcEHd@oC5GHmZ[y8l7DnZEN ֕ M,Y`98[{N6=V"Oы+z EHi$֪[h7&˒kDMsߺ?xb+b}߇Űw:Cxvoj8z UkfXMݍM}qS)Z[ɍOI7f-E #({KPܞ.gP܌b[5V^pޘ#j&n,j؇O5=&Ϥ#FO%*14kBR,hN93gjwa#28Fh|9U!rn"Lc iܢdf .\x$|;TO"@9yIp eth F:\@ZRG**bhYt\PC"ґI:P:`>*bǢ:[l|J"q" 91QG`Q5X- +)k$!6i)[|kʓ &LL@)øะ ؁29)^i$H֞0gD*mlP B&\pʭE ~ƹ1uc)5p.t{7Z NԔi c4CqRCrœ9BQ~W֩VH}p ñ"!`M1{-=_ъ gB>o8'xЫ񽼻̅((hEKux ݻN)ELCa c_@A4f՞2" S|iw;53W/M3i E*쏀iROU I?&Egۚokb[W-Z Bl0VeL-^ >Ԧ(]J8Djdp$RD/-n5!_Y@\)^IFF!ы-{͘דx:²̉JӗTml{MnPpčr{1[ 1cZ n?oc+gHxVDx@K.Th.U{6mjI=<9L;\b`چX>Q|k3u׵#)Fܷru"^AZ9m*Ӌ()R`8 Gקz?[a_̲981ɣuȓRTgO"bklDCthV+]ŔGK<4mX ӫ?^Eձ]q,,xd'D+,#)ҙ`T$+55΁QNFڊJXcK T!jj $w@PX_p+(7.KG`"02h-# @lkfg5?i K ٚq(l˲  Ahx1'CÑ*4/ יibJ Q.B8zy..xQ P KGeYB~ou'!PY/G\n{lVk98;h38Bߺ?!&U];4V-tcy0je5^ıi9!ؖ8Bku39s#IӷYH(^TwvTR:瓱sD>\B&?{6=s&}b,vN0}EȒ"v[%Z$RdSM3Ibu}U_g_8tه7=82C ,kŒ\8چxYZh^|T\!vAx҄kM8Ar5L&'Lڦ\-6kgk;B )%H0 i@q9jL#2"ጤJ ĉi+ɧ&/& 7&${07iXIT]*8gcH~874v}go f]Sۻ?tPu4ZUB*ԤDՉ>kZ=Q9"{DGT-&nEF#~>Q_)NpN΢ʡ NZ!ۭuG8>fjjAV 选XCPq^ʁr"v:粋 jCQQSTT nl D IRDa P"ytKcc*dStv!d0$3Xi@5V"K=VqθAjȒp11*ܞ[g#h-C)Ú+$ L foR&a$wL397=rNҫ5`~7;y~Iߌ&W[rXZsX>V98{|@>[oj`?(#pd^Zi׿WERǐJPJ28MUCU,!,2dݢ"! 
[Unrecoverable binary data: gzip-compressed log file (`var/home/core/zuul-output/logs/kubelet.log.gz` inside a tar archive) rendered as text; no readable log content survives.]
W5tN|6z;}yHL]B+.;s>]S..Y _pu6 .dRao/pa95 QaG,bpɨ(!GYGʗ7 2vK>U'Vz_WWbۧ`aHiU|qqI%!<Țrz^WǟQJڔK#uj/'?'\]yow 鈖*E2i/.G>X3hǗ[K#. T5j_gE~߂JpqO]z<ɻNGf83gw@  S1Y$ r&'C`y{:湕$Qhh8 m>>q!Jͽ&'x mDi\% $EdRlA)~S }0ljsP}Y%}iaa Waq:ExfΨz]{Y7Y]1`HذLr/xz?[1u#Bpg@Ow/Sܮpa>L.pB$5B\**GM3Qۙ}CPLFjܦoCV(WlDVM[ų }q>Dy #^~ke5ֆxI@PmֻA/VJ]西=|>9|z9)rp3>Fyr e6 ^$v qߑWVo9ig ONc;ԡp"gq|w/ʇ? >=ܬ+ Mpke#PxEZȤ-7@+M&Ujt3T.Z!N2& ES bp,֜jt8pԌjIR,|Y:Dѝp%@8xK)~}2P+A[`-0$%",N8A{ #8~U*BFDh|DS j:T#!!tQPnzem.*O=XP*olS~T tԢ":%E<7}tX`,U1a(((RIr ͋F3ݐjAhTq(d.1dXd#Y  4(ωۤ ?ixQ}I5J7PeR*'!Q}I5#\I髖RiʤT,蓐稾ڤAM%Re~CR ]ypttw!F Zq )xr:{L&b8>6:4¿ V( ]V\Ow/%XTk4O),z?L'?O&LNF{(F=k=;=Lb).,_Cmct:+Hp|@5@an@ +?ͯ UY<B{gbbh(́o^!뮗]` ]`].xqE^Kַgqڇ߬RnI4쩾<5”2*/ P@Q2l,P΍DW`Ӂ4gF422HZ>6 EKkݔ~<N2jAK0`a]SaDH1:0+"ZDBIsPGQ~Z@F)}gm *׆װdC8ݡ *t aŖn'pƸ6,0O֬7SjL D?M_ʍ$eb4ÝL*""<8k89F X_H09 \ߜ Ȅd7~YAH 8Dz`Tr\$Iy"kA­fòIjkY@^ nlLqr@vup^|-6P=^E5Q D#ߓD͜%N):e*qޜjLR7)eƽ'+%ʭux2,꿠>ŷh$]rMBSްcF^-/A 7Rf[~bPQ܂-.:ق礚uL7+}eŝmn_ƏpiLWD׳eQ4{3gZ?3tZ/OIA~م$G_Okmcl!jd^;z@KUzSxܴ'ϡtAEH'sX%g=&I/B@n+)%Z9?wW:Et>J}>x#n_3naxл5[{zxx{q~:ްϞCQ ѻP&0p^]ղ’PYT0q-TluTOr;"d<2_fVEkpK\"E82;wUd0MpS1VWjԌoeR3ubMPg75' xM 4NVpq,7{SjA |Y.Q@!Ot^oZv9y=5l\Z|S9T+ԢuMT2N5ZX%МjMև9@bA hskjjJ,<|Q~0f ?^z¶h?iRW/پw:#5K;Pm1b3wNھcIyE~x>ETHهr0ǏF_a6S#oqdoa4=Ky1*\u$!_)-H[[ JD;h]R,n]HW.dJɓw'ỵŠDtھv;ޖ'9vkۮPu!!_nTaqF%Ś5,y\C]X>F/0].K.^=4/X iq,PݯO"{`bڻ`F>.ߍ9pPj[\Q}.ifҋK<_Ȳz0wEV#ӘY/wx6|lJYj79<&єxP:f5Ԏ0R&!%a+Go_sHTya.H/,\7+5AF`_֛ I"~BĕF]d=Hj|Qxf$q)lDVc) Sd%ELi2lln;`K٢tK>m@rϱ҅J҅GW`\ZrmG| ,QʫK.)HR9Eo\RS>T<ͱW".{{=Sʕi>d\`񺬋0VnpȽ5,CAUP+H"x .z]}(V.-`L8M{6jq1"z@U\vo(}5\(8Syu˙7QR@M߸P vͅ7;Q!sC B`t6p zDA1.!7Y7@Kdj#-7Kޫ$qG.A2FPAA9I "I6)8w &rr[WKZG֯­&q1$G)i+( LQH=∊!wKnXa5ܰcĒgRDaQi(O&^iBVjNB.ǹb=$ׯ-sSh!tDrAE_V3f 5obimS1@>?_i>CpbqX:Xjua9F[4|7vjyLe&I8!] 
1ƮXՍRjفQ*%DuIE%lc@J4oM^DOQaU RThN9T'h3f ?gFj.zeSa$9F*?Ԩ6Ve@W^!5 >K&kRKsQj+[P(h#?{.z.(GP43vgWf }ڼցX; b9AA3[;+>r!lޛT (aQԧpjS606Sh@ ,-ܬK\5Lw:S?n.wAtbQŻu}yjͻŧz&!raH_#gލOX-V!Љ}FTּ[|@wk!/1PhyXa!w<,Ů TZ=.Cee_l!/9*+*Y͎ohH_IɊZ) UnD]f_4?t48vCd (m.w@įG3෣˻;eveOfo k&Oo}~>/+uuO>d;/a= d, ߟtrIu9dUU@Q.x4yZ*FPړ;:!:l.PXsRKMy#Q:< (Etԓu)("J) C) T/PKɺL#;t(e" L5x/PDJԂ 8adJKnF(Bj^^Q:DrR|ǔJwI=Y #JR~FRkRO֥q\rI~(< (eRZ^aJ M(4Jy3@) ) uDQ*dJEqSBi8 "_WEP[ɺLi(e4 ;@)a(-f3Q Yjh,S9$5`A ƌ#JA$ 㒢$%y$@<QКkcԒ5dlB9_*3J˭M,HDݕה'SR;?FJ ]a9?#{ҰԌcL8l Rs Bj#JR[>K;e/PKɺAQ l,&Bқ*i|J楮Vt.Rjvw_.vS[wonL'+պh9/1D^@)nõ~jFheF޼{kd6xs]4t6fc^j[ցQaIJ2:{`I4$MA6/edbm,7߰dR{ &.6jLoO/H^B br SP6S@#M>oJt!/XZ6c=A, ' 2іg P5Zq6; `^wo.Fϯ t(z#zu0՗p%GZ*ӬUt5{teAT-?miv n[9IBS$2LqjYSF3NP 3p[*0[j ʿb%.;>&$N*KA2PTBHM')URL&Q:Ԥizۥy6wSqⷶ[^u6)i2zyr5͜~U7b1gA:q3&ˈM`Yj@` @bF3 )Hs#2a2g:cCQW.7z]n%HArǐ Tst10My0\d09p~5!c(P6M&DYsBS#t(XR}J&~| #sXҪI?~9)I+' ī-ڇw."LvZ^Cܟѻ_^r:/hA7~bRףo'8 U2tO;'ZIVhBGXq,^v2]*w/?MV>^gccN@1'oTƛ(t! q4u*<*SE (UGưQuqc%c:_j֛,v#\1<1i;$ey$<9X_ތcg`Db(qe;("% (((zhXE|TcsK@ŷű£&cHuhf'@q [Sy"[VbE8zqQ*pA,IrύnVvbV-F)6Nfy05xDC g/6^~!9Mol0̓ɖ z яnǣ7W/ooɷyi~MݕBFJVN{6_ݺ7ˢZiԢL1r.=\'543+7 8ߟTbJ4B9r|ƞya۾~]ld.[hl3~ѹl frXj?P6KRM7{ԥr(PPP_jV]lVɄ[`ϓkacƨ9}J! KY=EO1((^.(JEq;2.tzO"ul}Njy]A3[;+>r1lޛi*ޣd"2Xtnx4.n6 C&M~)'e !C4 S9wCRzX@'U[q@*nޭ pFa U#M(;n:N3x.W(9iͻŗ@z&!ZSKtb/}b)=^]ģnt;T-/Pww%.'|u . 
NQS,tk/<2V 9#VѠ:R }-d]jOrDQ w: (eBjvd?"a}52˥dk5="Dsj [&\S&N t3jT򜽽/o.n8d (7 (=nAZ5d -x>zф4z~.@aߚѿd}i&/'IRsu%hC:p zLj^c8vTjȘU,nj d(S)$P$ahdn9$A$ynUUTpMZc}5V+JJRzp:IGsX&LSYJF#t*MJJ 4ߙסFB$QGO B)k$ӠhJX'wmuH~cH a6aw O}M<#EI%9Z}."};R#cWb *MiVރQ(3jNԒ~ԩEB8(0qa-IHm_l\"VָAڻRY),&eY((<(2 QF#>!naXTX!#Kg{›7[xZ'KfW4+Hq o|r.{ZIJҊ![xHV*ZT([:<i V8F[X\ EVSn-ҙwHyↂzܐCPdEY[Y澡2hs, .ʅ, tѩ QX9=7Dd@e:t[%31ebGȍq]f 0%ņT;G"b{`ӣk:M]\ im%ҀAh@hOA\PFwQà!!`:خ`IcT7Vc7VqpcUKp0>Cӑ.6Y)R {d1=D|YYj`5F^:榒"/Jger^Tw=QsK )1cӭNj4({ y&dSF~n{7#͂;IFw02Z0wKI n}X+7$J6bĹݤjLR11ox wFj9^^FWnma=Ɔ= y!XDȕM1˞X侜q>U[qUv,Io ϺQܯnnk7V`hb8\GAaV/im "0k!DN`*Qd#D٨ waR̤Je.ZnjVU L1hf;P5+%D \*\ع`\>a47o썦aEkJgq161nѶڦ@@Jl\%S  Xh6&\ᷕ6 XUŊ *W󂫢 Ң\^qF>kWqLڬqW0͵5RtlkA<IG 9'>WӝnqdX6О Y՗zPF+~Գ,L=Zm=\<]_R4.ɵ %!F 17RS6Bnc+vS%DU+ 2yVara+&7Z.KQڈbU z|Zc/=* MNj"]6*&*Kp? ZeR;-bkII,ϷX2h%>b <c3\duS'O*"]}G+|T蟩Ԝ(3`l]#VZj|{Sj>٫ΐ F]" @cA:0,xp=KX4 Cs/>l])yt)unw p5=<ٍ臟[ٟ"r~T.~v?K>Y▌-.[_鶺tc۷Z;6nɇ}D!/n.>9&!tOwtuʯ'l>nf~ޗbD`=杰>]okpfތκb!TѓVj/?A(muB T Ԛ,z bX2ZZJ\ʪUI.ZNWHU.+k2B_UAM>۪1{Gs{Nzْ}s>+"/.m 5)6L?&=or]14b;PucVŵHΠ72ϋX2zeȕT 81!xEdQA(PTs_btzE~B#8ei˸]K2#V,)ug Cg/CFX <h-5Ɖ7硠HI +]Kk+pXm^X3kWZVJ-+u<; 3PK{zs`! >G51sɥr] ҅dUe%@(!* ʖlQkN5`2%M!TZRrUP)P!`O(d0Vd!^\T4rU)%J*KHKp~k5s~ "d-dMVZpGw N^|fTā-ݷaZ\K,-qFfJ6y56\Z栖;Ej+,kgIPh!ul=RRN9ڶbH чĐ=7d8%0Lnx ph/UUq {W _ Jhm h\) B\&h2 F1[ye.*%W6ӹ}Z)Z.ܟt\Pm7.Pt1w۔aN6QJ:vٴ}av)s73c8 =tmtumS4| 7ͣ&O;;?7NYV;!0\)73;UO*]c*?G[HmF,3MˌObTfQbsG[ύŶHxU?4 u:8σ^w=v Hܛ|,䕛hM)8ճnN7b۔yj@ևrlS,~شNOi C;=Ѹ_llcAɒeom.XiLՃ6Br2$K`JhL Ԁ??.=LYYijQ:'^U%hsrn]0T2s.,`)EU Qزʰ, |aA{M'nwCnK"tfCKM6'Zκ 6ʔH?Ies4ג.nPZ@BWPAVvXu)-.3w6C&g.:̲5.~IAeYVl 4gZ~R.Rs y^kZHbZ=skfj~kjgޭn?.-F׃(k;ՇTU cC8&WId'E%(tQ%݇D5Yot3 Nvzx'`]0LR,myIr΂۹JjPe4u YerrԪ>^1wFaQW¤Hٝs&t?7A*: kA8;ۗբ(ør\X,ĹHݐ6<>zxW韌,z^RN@7c0J7Zk 1vB jeQRXI @$eFt*0ձ XP Tz\eOf{<4?0$}]oZ[[Rץ\$[yr,KݕښR0"3B-,T0Z!%5|9ߪrso{ş[(|!zeUgܴ_m]KL 򛗡*/e.KU(+drSJIS*}L (ElV򗟲(Ӷ SJQH{{7#vGin6J4bL؂jlV +9n=Zg:ȬɎ}XV(>L6K! 
IGR|{\b6>=qG ʩ>k`I>BUa[M|ΐ>>.V1汁Kt ־AP}dR|nzW`D!斤HG= pF!KOP;H{SJ6S\ŐƂcHd )8@ZApp䋊}籋8U {$_}aHCZdէHИMH@fg(K Z5j$-04ƒȫ\P);2ftJ!}e[j`rѥbB.SP1m@\t}X+7®>yĕzϻ R11ox /Ɖ-=-лa!DSlJv݇q{70eD b]z@$9B^lڔF$G8pc`u(n## 2clGa.pK|2x t>)q˾|GlL-?giܙ#d2ŧS|4lO.44Kݯ?_Tnw(5;_9[|qӇӛ67+f}wm~u=fO[>#+wuO}>/v} [ u{wo&0sF;X_:XDu^ :K|CǼd;;lJaKRk&a-5JJì[,ԗmAJJIY))<ԗmjҳRY~3񥧤lK-URHXic2-f9:o+f`,JOI}ٖڅӋ/=o+EfY`V6J7RK#T0+U[*3 +=%e[jh+=c+Ձ5Snn5Ny+=G+ ;qIҰSԠә[)0+Ev*jLi(¬t#fmazs><),4T'RìTL|aVqgmKs}ԗmr^zV "JasESZ)0+I^2Jr.faa9Q;ҒuVJ:JI%*H'۪1oN|TONKzٖԨSqYER<*gpW3'OTc 썦:%c բr/[P5ަ@Uƕq[i6ʺ1W.$\dp_:3'#@QեFaϪ:wm֒xk"H-cB@17LV ^BYW+r K#mC~|{sآX (=k`YCtԆZ xmr ֬}ObZ5|f %Ȝ17%?B?{ƭd K%WZmUVnd|-<UZt ) ɡ %Nqih4Ѝ ~Ok9I␃Onb.վ8`{ڬܡZ8P aUCm1`)pT(GH(#34ѩ%Zɴ:MMC0TV ePߓ|!k,=(B9ԔS{1KXc^AJ $g W4"H.e”H]E&ɆX VEb$]FW P>n9Bƨ/' t5˚oY/v;?a)aB;lL>gbEeQ;n-bߞ\]iB1r/~X !h gQ'?/on-۱$P* jcM{Us~-νJ[̨i:z((On.o._+ۗo Y/amh@nv1GyaFul(e&TGS4"JL fiN^*g.m>sV#/Z8gw"g3H06\2/NDyKr&Ѫ! F'~&aHOG{ +.#N;X(^}N8K^0iŏ"ݽ*Uz"1l>Ajr<+)$F4|vu[wFi\FV@if؈$"R&9IE$,t&(SrD,FH*mQr(v"Ih& ÝHJӔňVL#s4Mi14Q$M $,Zp)v`t]5P ƤYc^ٵw`ֿcf͉}eMN΀ë́{?s(/CM /cK$ܠ (U hMI1tYwT1L1E+>pPpO@~0F `fشQeF57rHy$xFi*dDD5eN8b tC#% ?l]48iFʏ=w4=EpN6 g3TtEP9ofT2Lh3rè{47PӞLPJAE#jolc1ψS"}&AQܱ;RWS/uyG-;DfTn֐-{ܰK a}kz=0T"ZR!8=3m//q>` z#M|(iGgYD {E'ZuCƮ?=LAVzM]~ReWLp3_sc_= i^v}sqD8z0:!t(vWO*@>Vxs= ,fIxg0 29Fȵ"-5.9]ׄ{,>pĻ"p%%1uJ\l򯛛wW_=Z~)'-25Zlq5[mX?;b<*2 05rY y(ؽ feS6-|y/z  av $ v6TV _w3}!PJ`fQ8_ a4e@BJgec]yRB±=֖ow/Mc\h5 [B҅ԜUX8"6Q=Y,'zڑȫ[]br;N N]`-:x ;]^W-/S -Ko'g<v6fŏ%_ v9ݺSAO媺cDg:-g"3x@۹ E2֒r )G1S{Y7Q 9u EuBcztGELh-<]Ѻ!\EtJ{;ͦYb:nC"A[i0Gֆr)StE Bޞ,~NզWc|*r{BAANЂQGq0)/vF-.َ1TRxX^1CJm9ꨥXKq4-Օ2iR_lKM&G-}Z궂3>գR}[՛lĒ7 $%)q!˸!N=HZfP|x9RTH PA԰HJ1R'"7Cb+ǖg8)tv`׳on95tu߂![JE§j/v)8¯|(#\>jO3P1AIM TrHi>*b]^PW ?qJqm^+ZS'@SheeH3Db SHcCj&"fIr7v[ehtJW NB3NbH,tBb3!xl)s6LX#b&ZH`rV"ͩUČȀjT2tb 8Ac!%<1 W? #0BN^XLDH*185ZeR"=6$T e AQW}Z˻OW.=l2A" QX==ISWRˇ߭[:g3DpA>|~.ޮ??PEoY/6 iM_ޜE/V_cM?犠0ߞ\Ǚ3E8\EVV`2kڌjLnWXl1%')ɓɟ"skxkU8ܸD3 !j01qgC648I)Ҝ6}5b%_EW߿èV.6Zw)E-OQh)NXD(. 
TO>WO{$`JGQ`WTvp\]2Mv75j|sv :BAK_ ?^mZՎpzlqʘyswx^>;bQܓQՍq:.E?t:|\yn`_lAX률{uX:GN_vև'|!{?*⚏ۼ il..Kb~ŸԛՊj)WW6`Zm:81+Q{z#X96NoHBs!$N䉚J8D1E/LdNcOx@%Buxk/7A(LQjr9裏c{lq3q>78o>5ӫPa]_з)ۧ)Pȧͧ3ӽudt##F4n>;}V9L1[ܫٙBpxEӟ≄pD8 ;=0 "x`#\(RtZ߉ u"r#9a^,rꪾ c1|̃<\_.:^?,uWRzTj,uK)W!#$!?jY&3!>m=1Z],>xt莰̖B7{՛ʊ$=lH+)XG:lm|d')*J }}U7p2lq:;-%[,Q•\:gTuP_En+[TNBy%2䳉.ޚT*䡰[ÍXޚfrfo$ ꗰς6{8mߧ2"O ȤE6q|7k3 ^YeLz%۰mȤx#9ʰ`#xp702!eY*:5 i'ar]p^$h<~Vp2> 2N%Vf8\Q@o TSs) {(ť Rh?}BiXY谶lxd2[?=w1 YTa r*mE2$ɹS_%ңMFO`b; n?gyWbv8.XNRf;fsf?:ͱz*(%{a3OE};E> @05BpfmxaI6e2ijl?$.5Iv+5ցa(z,kꦫǫ9osww \]+}}ߟ, >\J|j*U5?Xg?z:k~g>QM-q&9Y×11J{Ƿ?)'e#ywb1qmݘZI3FTĸN7D`S&,Xncxw$˯8j`Ru:\E(Tz7}Ȭq{͇XcްDB TմqP+g}#"dlm2m+5*>o/ĆJO(8/=?/vCb U6p#5 9{)B"&CR_w/^z^Gh*=VExiFgd⼔5\FjP9k[.`Dh.P|g`XJK/="uWj y!X &¶I+ܹLZ)F1m:錒2fQv1fyL2Ju0 |L3.O 9խa21tC,TlP(J\ ՊʼZnUUhG)iB[@,!ǸcCdc(Di7^eBghBrVY!ODx ^=-crbBMj^2,~bZ zX(!`&,k3)Hqe&!r >VOuaғoTJX]QQV RJVe,DXPRRC{ycC#EUu%UkFQeB.*(ϕu.,1d2^oi#Y gիТ VW:7m^/m. ,@FJ¨Z{K .o@YԢʄC&hz xGѽHGP4n䫘VW3hFfb &0J. cC q Z0GMhC8d&+ݜfkL$0c$a:B,-D?)m dK\YEm(ɐ]~efϮt5j+tp\G@`` )F=D6D DGPHDhX}HGWQjs(&LDx/Q?kH | doIWۢ뿣fch2^6Nr#P.ekYUkm LE6QOEZa *aD^,w ^.[aMWsK24%魐 G)alx0~z[1zZ2['|O_Y|I8qm*4b~*$1/#)R{ɖ%?btI'K(@#R̼C\exXDc\M6'/ XA({%R`%JA)B7zv̖9MQv$#xw2d-dKn:qon^M,;ncxwb8oRu:f\Ev O"]tOſܨJg%ے?J]DaPyt,9,_L!aJGE9tw0DP/KiSQ&¶Iufng6 F2fQ4YO4JxB9Vb2ܣaѢ<}|2| a$4ǣFA=o8R\*+ZUFʮtrU aPj7BUcdw6,Ejs7Q9*݆>-% Z4̶‚=d yvll(φFuo E|bK8:LVH _Hymơ:`ĸUdr͹%BPI)*XQMRAUZ9 4w(?RthR[:: ISQ4Zx`AȠBt[UlTYm<,2&P^K!GVuAG"@;ay"[sf ]&R1+%|oP-(j1c]W2U! eRHcJ _钕J]uG -UX#K":;z o-xS'Hm:P(;1܁ ndZj5N9F~b6p.e{ՅVYA< pv}*3%"p.O(Wh9v_*e&SᾧZLv٣hO Ko\b$V\XCKi>)f2hXrfsb)ZmSZ]4UgV4Za6qaH a5$n÷pjK;Y o\ ά_HK'g-bJ. S~zr+c^B[k6IOQ<z^NO%pyrLše)`|ХI H`V>Z=xC2NͶvH|B8\0@ R8vpy@ͧ;WF+3B,0#7hՀcްA[F~. 
9EvH_>݄ RŇ>:,uWXl;tNpj j;x;Lb&9x$mL2|&iߘ{n7kP9OF-B2wcD:s]iVR WLVT\RR[{tMeZ1_81K a(:shڠ~߸2ўosv9Moq0O$}m$.If}6D&7mm`F/}\DMtM= K\S(C1JR8TN !7HA>擯c[u}ȉ+VէRX#|_/X7<6 h_/_~!ȇ_}3F[պèF-o'{۟;?KS!W_ ܍qVU%[| Yݵo=V7O~NSeL% W`튜+SZy( ,97oa,n $=Ty-3z/3Kr,YBUܖ &BR\+#sJs卆T#gHR99W̊*-hLJ:/0Y+ Xg )sGsU,=Α7"պ/7c,ǿ\}竐*?4=6Zy_݅*CxOZ*u柛w/'&LlιfTD#{0\^y)HHgEa˂]!~obwVޞ͎D2ב{JvJYWb:-Wyi,O}.+Wj[gȑ"Y)d怽DscE&odGd4-Zb+(D/H=eYXc"Nޭm02,xRޚ*EX4Bjvj^)Z.*]DXbHj¨JK^0Ѝ $'4~DGݭJqmK74nRQM$N bI7C*M${RwMcof&WLy?Sz-&_}x:>,ɔLgrA'@@ˢ-EGt̐* K!,"FSAZ-F (1X CQx4LºHǻT~hଡ଼JCWl{گ4S#pkpYYk'J6a-k2])T9XT-.5o&` AB,`eh Rs@^Kbj$yo=Q%7^r,ZˉP"w+AS5-Xԡ)xLJ \zƃ>j ƅML8c%ch2VK>xץX:be\}GʨGJZ۾+щQ@W,L19dû:c[R&툫`SnMn] +hjWC]qnz7)Q>W)}Gv]澽[vGS[!SZs)Dr-:{Xߴ( V8 dqWzLuwc*1M3cpjG%m], >fCkIx8%ls]N6Ydb/d' Eu拫&M@v-yePAu<&Ҷ x$>vRye<6}y^J ;0X愬Eɹ,EpM67%e(E6)*YњKT]qD_vDTIs+Œ_Ԁt\6JfsJJPZI F6Q{|]+5tL"KC)Vi9O:zYMvp͛&p7޽7wd8G _>}2Q8.?weIMc&I=V븫1d^ڌ/KH r_?^VGr@eu hRc lҡZ$g iq*sl*Tm5wMyanMXOVD%q^od `lnb9i~?_ogruαcob)zэٸ58'V6Z+dp̍6#i22#Y?ev5Z*Jp R[KGT5T*C6d\]dEnh錠52$mޛi ). rR `R(-lxZ͔|WzCh`| 1 Go1|ynYu܇/4|6h)\<~_ܽu_Wm/?[31Xp*~ N2ɄE,m5-BkCaJYRN+)@=c^V S'r0i-"E3H}s %Z06#R1.qhL|y@]s%P^oQ R:Y:$ZF\ AH=]ś^8ͷEu]>=qB+TJoy3A%rB>~>xHԊ/x^꣟/c0) -a!^_rs{4Uf,f&ko"iJ:\ۻoiO{8,Lb=5cTQMVضbv:2n^`I1&7mj& DmH^U0⤘Hc",ӎbYwHنe#?_߂7< 6/}z=Hm9'3Ii~P2̏ⷛ~ngTyY5·_NǒQTYh(lPn6x=+mUh!\RS]' :s+JIY'.7'!6_ AuGsQ烊gS5$LX$IgQ`v Vٸ`jqQ8Wt؊U<ZڕYs5mZaxE<fM!"^3mZLz@E<Zk%ů"vrv1:),!yÃNM5VN^2O)!l_ Cgy=8k#Q6xE<.Xz40Y<|s|S:P9+n+PSW`asM(IK\HIMn(A%glvKk ,8}k|C|]jIj\͏jIysu9Y?=NVD76 8ݷq8ƭ W.Lwq#Lkfx8|Bءqwo_n0+TKG޹8- xpobʐ}Q:"{v^dL!_9Ds0E?lz7t-W)}GvՁ[~ow!_9D0q+ h-W)}Gv]̻B%z.C=QÒbVh}LۍwE7q\_OnW?僓x0ܽu7sx|wJ%v^ňXް_Ԕ#bu"^S9wH: (=iHC). 
PI5HCi%uQz(e3>NʇKY_I 9P^>s3GCǦ祷텟._Și7WeVMhjqWV<ۃze4׭+p7 Z(N;wb,vJx,$na^y9Qwa$3R6ۘ1I]@R#F&?A&Ϙd2:C2 zDD)(?Fg V᪈Ӥ8bԸsoAw fm7tb W,DJr$_6.by~a K-NSfO 7SmeaZ V`[p I5g [swcDRM{:J,mn>藑펰%;Tľ}rgbO;}V^{i}[JZ?m-]|*;byW:V_l';ZkrO&j &s.ӽ*Glnޛ&TXG+vpgK3O;\J&5q&(TvEO9Г qHJ-B VH_,9A( Dd!`p(5JIYڍ!A-z8Nqlı"sj('|]jQz(/LAi|.x%՚Sz.V|$9mr(HCi%5 N M炽æ MORs>KOӅWR@>RcوC)$ϐ A(Q .+C0ʅab2,esDW+,LS,=(Zp^B `U\B%>7#A.cswթadu>䥌>!FB/tSM$NTç%rK,(#mQUX^-b÷*M™ncsyxLnH۪7py )i^rs{40R$RO=| ͈;Ʌvv|P6An'knI K?rGjU&vS.X,C~${z@RIEW{~@@===3_lS .;Y(" ='_3`61.vo!uleؐ@mz/0^ղ^2N:i4l{ vy-cX3ǭV# .L(ln{ v)t`k)q ba>dS,};>px4 K:ό/<xNU$s}D/>-bR JM~{:8s;? ic7'}Y4HeN%5NXr'Vrn+ՊQCrOc{Kr~b\i;mgZӭk\ZhAS'W%% x.>\ph N+9oz'v8Lû\|W|0eJ^ 6fД!.;= w@MHz7/u_xaU:p֝!U_]j(uUaeFu~!/c(bUsTZԗw{RB9&@y B첿n=ۆphU½*MǙ Z t9-I}򸘘%[P !U(nv/v\kA76<71}Byo* p.S7L³tY\c_b/.t^P& 9s QD_W=&D C~ǻu}Njͻg9z:,M4Ħ}_#=f3B1p1ox.w4³[r&Z٦`Gn.o(nO}nPZMSgVie%[q#ژ@SړIn6Qp9si1Kt1̨W9B/QRezM3&y2l?C̟1)gix?c6m] χ^nj @kY{o^ [̗^4sՆjp9 } d3x:B'E6:mizͻꤜ2VřSKūy%|]Շ"aE2+^Gpif G 8H0Q p3O+ xIp,[k`sز6d$400+>H0ڸ&l\ 0#.`#-cĭqM8R 6Gt}bDCsk\R$" {Yҗr%#vw @-/Qxd$yFq!XHJeY,3HR;zA@+XN`ũĺ΅z)%V U`WEb ˃3*l'מ5NKF G.FGmM`mJFþ uM>tw2q:CYK>۴LD"F^A`y5~^uBaH>}JU_]jq|aV \9. 
L b઎RNŽwR Z SP(.mXB);juZlkSPx) 9s ))z{7YHлbb:nE3SEa^B감37 *^kĹ`B1p1ox.wz؆;z:,MMAXlik@Bu?KR*۩lWJP{i~g] R9n/ C[`"^Xu|u-QWFdlUŦҪA{Xu4Suo,iRb*(R֞lj #S#K!;#kO9VG\i Tv5&B N~u W.*TZ|Uf{W+*&UbۓZ(o\uSAw8^a!gnA6Ey=力izP \L'>ޭTN감37>np#e ح]*~4vn5XșheRIwiy9a$,>=0\=8F%h~{u-]ODdzT8xJcwyá;R6HBSe)ӾNĶ?Y1Hv5J5Y6<4,~C.SSY֓oe[Э#Yb>]o|>Dozrh'G<gC̟1)gix^>y0~4?oC@{kZ_=χ^k󿵞j @u)dT1M0Jb yL U0qL m![8O/?>^u`|HcYݗyCЫ;5QY4/@jxx ve R /6CnVHp$.Vd㗷o+xl[.8.ݞm$Mb0zU`pyzfiKU"ټ.v/zz~w4HչJvTRעJa_%CY!Q.Bv0O D-RCjaji}%j)t"J-u:0-dJՁIݛR2r0  gQ+l?Z7g@PrhIn\0D7 ޿RM|dGX0[R)aL8i /%mM#$FbL8T 4P(ׂd4>LbX[[9^ʡ];0䤢8yE1ST>,'pTBTJMpaU׊`!\Ӌ"CO(fô|5@ hb/α`f/k@EmhEhcC0em(,!,qz1$Śv߳ 2i%9 s*dȒ+BEdBepHMܖNq`( rzU) EXИF!4cIp*TY6"B\I$BZ= #tv!ђ'0ɉM(.ʄbZ#4X'FI,T hJ0tLbl,r˂~ؗyA>u_V qGkeI3㏈tjbySj+c1f1x4ml$!{JG H1[_ߋcRI]$bBwgpU*_Ljh<^*2$w 9ApZWpk\o׌;0l `>VclWsM3Z|V ݬ}nBi.8 45/1„oŬ4CM?'s5/VOby(Trwb5b8T=50eVXxxwف3iB]0}ſ^l9/|%U̲9LzCh!ν%2jpK-i=[.n]bҒK -Y+[]}Wz#Q ^[́Ţg:iq{󼑦^S 1݇[RI],"-rE ݎZo]uv# twu ~4R~ c/@PSl"_9H I_݇A Y3_KM99Z:aFi  ,p@68Ԋ[ՅI 9 Y7|Zu,X Z˽DZ^r&dS~nA:nN7|[wq.nՎޭ 9s BAN_#} tzP \L'>ޭ]Lښw /snuXșheRL jv|cJ;8xlo{q+]{.5&H^nNVH7+%y-VJRJOJc~;ygcRJM"+=i+|#u/;R".մ'l ?n`A] ǗYmٻ6n,WXzڙppgR2+̼Lń$qI͛&ͦLe["}.s?Biرq %Lʫ_z(eTnyҒju(ܔ+6@kToR(%W^0J_{9ΡT.שߤJu]/]8/=@&@Q  PߜS F&B-l2R˵רߤQu_s+4KNT#RR7DJ_~j ZQJJiy^*JC){F)?By^JpM4W^6JbqfFnRvr(ϧGw4w]Z.F|hTRXn;Gp]'7_m?.~qtwdD ګ_x2]l8<}OiT/TfL,zBϞp#pm#k_"6LDޯ4{K9zgӮ' &]6bki۫-@ގZ<b&]s.Q @l*È.ɝ=s{!"וM G c~z87R_h/ڨ{ F}5Q@3SkhUն6ٸ49ΚXuoո"F5 F7os 0choQ>$x$tq{˸x; xdӤxo˫7(4YL6H aK۾ǙJ]&y$z5/~ݺѨʘ暷7ȄdBZ$b 78c fx%,-"ff7y®&t~3nh*z,}R Ñd%-i4&FJucVZ"uwn\L╽iDd+w#8xBoND2V\{uw2J2{{03vK .E;9l:Mnj>L-MZ}`uɰ7F9Ά…&@4ws!k+_;y_Z~ʻΐ bp:Eܴu]PH+'Rv::@;t4~~Q$?׉$֕Gi2\+F5gFu~$'eQ5=qD9 4ޠΗ0V٣niE eD1%rdVD{ޱ#ʡ{vF^.nZ̼p?<|=(8BniS߈d`U۰r*} jagQ8UTLWv>rbS tWʞitrփ!g\Z:ZWC[;Ig)VN1|^sylCIFR޳VPq/QT`r kٝk/RԐ.+J=ͰJfCG!=id@ǙNTkՆ8@4$$ (jy7䯑Mr UTK˾w XHބ9'&V+0`m)JńYAyB҂1ESgH45HrJ qxDžF\1.X  E x@(8fy*vtlYf֪J6D@9K>`CS*Gݟ>C_^L4+q:)mb6'sU*ʵ~+բr}kW~Z LJӖk@۫?|?<ݺwHzՒPұ21u-n䌠 .T pM!Pa7^-U^am_dqgΕU{V^Ktϕee5VSMl'44]O@quA|<ڰJZT'O@VEP U5?9!ŽCEpYHg 
V}jE[1~~-N{lݡzv<ysZuyMٙ! Ξg|ްƭ\޸ wo߱j5wϟDigөhQJϫfQ }Gv>֬[xOBpaJ{MT2bhBewԱnEp 5^Ӻ5/cnwZC0'l&3MCnj#53Hbf XpaL쫗{8P_<3 gTa3A?n/ HHcZ( vf,F4m./z2~9K\R֥c+tR] +-A1+et{1hP-μZG[1ۋ=h{ǀGd4ZGBiF;6c@b@kOVZ5تsU/گp4#T1:,'&\V\/!|۹/[s:Fd _TᇦkPP /¦ a%f揧$G*9jk:?j=r+4< ہZv<Ն@e[wEЅPφNm!vp3 | _=u tBQǺ2KӚu iݚ@L ;qnY7 c3ZP@'u9w@)oͺnM Hi6ꭏ(dH~ "Jw!By%^-?uf)G;EjV!kDxc("xfTS*#Kr$V'/Nf|fnP͆.< _2kt$o8xg'wV}D7(u,u{BVrw{Nh_%__A,ij+t\+Uޅ|b~1l}?RQ^ 0po09-VKQ sL9 N$[&BՆL-otۆCw< Dg\SǭwJ fZ0$dCS"&HtEMr]QWʘ&oԏ$]rP HtӃ(Lj Ip(ʅ;SJCeVy*͢?+LnyQȟ |/$~lbc28?ʗSGG  B+3\_w{Sy~:ݜ|߹w; ZhF(H Ȩf=i!)` zm xK` Z_1~nH^86Eۯ˛;T/¯v/uJ.co7[=w立˫KftUuv{y󾩝 7Y*UݺcUX|:urR~~9o=+ pߡ(b$PpMO†%X)b,t0E9m"6'.iCP$@@Mő!6vMVObtҴi*0i\$8>څиG yPzV[ۨCI84o u8sFD#14-ԚF(ϻKmEO,N4XK_( OIKGKMlWPDoqFO]<(=vT B?S#veLxf6= dB^&דy7.$S11nㅻ^r--лua!DSlJ2`̻[*!1mmyޭ y&ڦ6L "5$%7BC^BT" Q"7%7fCdͻf"@"HȾ/"{0],*h u* qo~鸤ruQOFJ7]%9&cBBAQKǬ)#2tVƱ -YGv]F6KH8`RDb>a )vFY{rM9og$r6kuqkE-I1(.h«Tt;N^QͶSd- RΞ2Tf 8N50 DpvXZ(|uI7/̐Ji\=qrm"#jCjd>XhS@AA'wBi甾rJJR\ka*uHZPr ro%FUH*ȫ\HáCkHp/]IIJ~z>j ׾.^ZKBέ0Uy!rA!hZ)g_T7!JVqlyCۼι,e 5ri`M[YĂS>A"C@eX,y}Y}Y6$ 4}0/JΥ8goZY%-Ȋ19BXׅԖ36U'" E;xAFD5(9m#97@ wWmM \| OLp{0 01v;1;q7noHAv Qsb8 Yb< X![,V1{80|۳M c[Og~әTCZHK &~L!\ P$Sтs)@ 9TK.&-'3aL)MiHS)кqzi5S+s Bnхw84LmOi0N[Bۛ|0_NM`܂po@9&3 GpC41HT5<*J$5WŁ, Wz 8qvSԛ΃7sZasv}F `K6dȱNsN PL|Rו61jbY~raB{Rj:7VssaM!讽Hy?xbWj11"J(dSbj8) y&dSm\{7ٸ=wK tR#ƻwy6[zkwB^&ٔGϼi[*!1m]}z;Jޭ y&cSWPH,K;  y>.Mda4ُlԽlal +hTl´p JupFi^Zou=LvtpEUFIs>VKe+Xcfҵ-dE ѐtxn݌lڬW[n/٣JxƱU. Ŷ. An>Uwş?9:[ У;_MwyPs]=Gg3[>؝`e~ߐEQ_Cޝ`|" CY?n{e!.j፾cHxHC)퇰G%j:3eOxh3Eɛ4V`Pr3N# Q/"7*Ǣ]e<\Y<#|Xrm|s/\ ͐77Zsٜ"~Μq|8xR Xoמ H=}gtuf zAzo᭤[ E]ag~mF{~|FM]4Gp'QIT'fsЫ9A ủ`.$ylJJd*BqN7Dp#>.!0[^k޿Xn"|\_- nz{έ=GmU_ʂ|>sE-×ܾIB5%2#e!ЀRy^B@%ЍRYLIr2Yn7P pPF :2cݨ@ [֔Z@0pHTVkrR)SԖP@r}"} 5]V{(\v"8j:,vS oRuޚm$5@0>-q;M#y'C|X'RUw]k遼\ZFk xfp5ݩ8iTM"Z ŃӸѺPN@~%AA eNry" 5rW&*/˧DrQ=ut)SRNiOeW M 5Oqz ,pSu  xޮW}n.vAqdt]K~7fεL,MtՈ0@&?Ajkv BP͚:*XA*B"zBf :J#T|DdP9j7r!XNK4 3 }\7i('~$H A}SVY?Ձn'S|S3 oޯlF.z+՟(Mָ9V+3.Dk/L{. 
f+F%\j r|hAncGx[p':_g;\.M]y ˆPA"r%Vf&@f(} +'i(ìWᔜR\PFlT]7Pe,gXJj3QX]2s+j X((v,]`hhLbn[+yj[8D`%뛱iX1/v&k[VM+a -K}+5(>7:m+ kIVX;ǴҰVA4eVzVjM+hia67RsӶRKźm$ZmR`͍ qΟ@Jܪ$k;XUnWJ%AssQFdFkMfr^gylt!MͲ>TQ@Ȟ}Sqk񭧮 )F.Rŧz SVWJO.{B}vJGql(~D j 9h7h ނ1e4A05PIrnђ'ezt*M̀>dlY.ֆ149d{& ̀йV.Et' yC'hpT7wVl"ɆWBN&\?{5o^Ȍ(3{`P8IW52+~,{ mE)K%;ʏEqyUIMu$Pg? 4< i:ru}q& JH=ׄ4LӍOAXr[J^EڛOUf;]q- a}4cqP;1$ݵ3{yJx&*{4$ UK~5y ?f>i)X#UoV!1$3+_ŤJHidvC MZ7* ek$ČyQbD IHїw[ @qR.E D}50#:RslQf.\ yD]ƴg2,x?`Fp R#5Yva'&r5ju%YO>?Or n-=bոQ*`m" ]*١+:ϰ:um5SWno Ri\{测{LZ6%OkOzU5nB?uz3zCc[BjR!l ۍi汤U6>cqpL`0O$z8IkIGSݳۭgrOk}0?ZvUJ?Ps"hQ ?A@~$ebrVps8 C$Q1>rh:zcHNc[# "fzkz9|M+_ aI*8Mqcjc#Zs~.;ÛKݠ&M@EPK.pU"+JB+@-uO&k Ƥޒ 5[QfZHX pNz8: %ע5U\ b9sF}-Ah5!@NRk],1}2 3HA QZ, 9c9ZFg:(Lf\ d4s FZi-#$f.uZfblbuh1'^(3Y2Fި[rcPC=siå[ɥ% )8BFxLN!"2{'$(ikEwZ s:M3e x3=:aDb&L024@!2Jh ,V%CITh,8[ a QZ1=хUZs3k&K]w$,3 Tey=+#qakMq(b" @,AcrA@bVZnaTY+\ Y /n]AR6_^rRG8[|))//4%3ɘ:ӆ-RVI zf KDipI\Vabp;jHFHkwKLJ0(8\z [sP@xE ڀ% 1в') æS}4<1pi 8VdP ˋTwOCjsE7|ʚ>򗓘"7_8"$R?&oT+-ң|g ҽwd_C, Dbӿ zh[>|S(.l:a{=Ё{6)<>J9u.( C{r~0EoJK.M>/ɗIYSD  `u0%w#ǡx0!8DxQ4`ٰ$h ,>B mhsm٨̵` 3qSySz kߙp gôaif>3xPWzx?[??ypb08t:|%O8BlM!߂4r] WA>^F:~;lLn.\?_U3}o?|qo.C\lj.b̆:B5׉pMʎ#õDщùꤋRExuw2?Lk['xBWxCOoBl:o~&V˰10&OxO>y~80gm1A?.w  Mr. Ќ|/”Mq'P&Ux030T-+e\LnIr;/?b.~Nw~E=;y/ݾ%Hh_3 xsl ~w5g>M~Ԧp\o.K>oxoI>]r!UVRubW^6uuΩ0LjI)^_*38ΐ ɲRkbO7FC/`.0uW0oxGЌL3Po j}R}!x=Ilo}ޗ7~aA΍?ݯ֋|~>q>h:XºД>力Kտeo^U_ׯ1:{p3 ~=>]& /8W-xOMj6a׉O^XsSsyk}=aOLi;x1IO*nz/ sK^ݽ7?*-_˟`Yַl?-Wԝ2ಜrzU3<ۋC'Ӏ_LoBe(~-O.v\֑%cy&2Ɏ=.vRn |uuQ\v%_,.}ѷG(GnmU̲-WM+;㉫ӶPbIՑ<l ̈́62&yPD0&IF45e-A:3(*RG{N(f)婂y `Z L#Dh+$FXY+}+b5 4BPi٤`ckj0 W=lbt$<`Rr\ F1'\1$$ي1Պ Im#GJps"rtO9̒ dyުF)E &Y"Ę ɩ:c횯6AVbnAK#HK0O+ahVF-@"MpWAhR 13<\p!jrԶ&6Fsk& 6$%$ԓdD1pQQ=Px? 
H`ʘxk>c&X l)[٨9-YFsBS$tQkԂvQbҺ(đ(& ]AK6Mܒ ;C>A<(A`J Q{Tr:IT$4U$焼bRs.gMK;k ٟ Aa+~Xw4:jg5srsBud;9"N p9c 2俎f>L?xs{sAS9B}Q:T(4U'HkgljHdE)ߏ%fcSUW7(ʚ{K\n(qvmXerko]k ]^uX .;rCBnG-OgM2d7.-Ey~/԰ج٧?WEvD拴wK}Aݑ|&Ʀ'mn.=i&} lvk_,rvBp-Td^NDmdMA0@N%Q[rqZ2"B @׉E uY{}[R jqu~89TYzվ|6-kZ;H -uINJڻbCs w l6[.'} ) A 8f9]>˷ nY,&李8ܰro/::Ek t%V>.jrV2FO('[H-X i^J2P_ tZrq^2̲Zė }4q(l!3n,r/ "x%U~>It-|E~îki"!}QpC 4Rn$3dռck݇f#ѻ߶//4e P;a3[Qk/cdjfhBU(C䩱)lu( $K!PMbtlѭ |soN Dq(9  |#\q~Ts 䒱 ?ڎbރM zpI^/3I$ݒf*ĒJqq?=2$㾑F)FJΓ]X$6#,A"YvYGtR>F~BE` U޵.8Nȱ % @)n,q,RQT)CN&hĐb.& ALBI5FsGe%uYN& g 9iEĩd-0Y(q:đ"ݱNo`zS|(_)W/|<1!DrEdsjqv<֚e.$#b7 /WqtYżuQ`kρxg&d| j%q 'ZuA1)rna=Z 8˂͟W`O+,FTVx&XMƛ5YkqJR;d`S"AHkF5M!Qt^pibBX4kZھdz+Db[]1ϔ3Yda*a^Rp!jrs,\1ILLRY} Ԧ(*9Gs^ȿZ'}}DP !V&wsp rQݮ6+k􀦘R>k %)YkMT XkO8dOj1:b5( @G$f<5#f5ص1Z8U#:_1+Z޵-Zr8Svw!=K1+!% '''<W jA1Sms+S#֊poROMdV \.LL?sݥ;ߺ}< }5r+$uwb2Hc+FtĄ#D1?:F[uD(Ơg [XÈ9,1ET2Mdd~L&'dNZ{W 'X ΰ8^ke/,އBgJǖ8zWpLiFZTb-dDlg҄1Beix՜R)`Ras:urmjI&Ұ `<\\MݲM M9yy\‚U>sO_|/3Le?~s*njKĐkrj%ϥE8µww7_‚5_/CvxwH(u+S_J1̺֡fqroѱv5+X-bD j/W}wot(}㽉ntnU6W-McRjOK+15Pa-k-Ȍ9ٻID@ %4!M ~ $* OQ,F@\UXHoe3B-Y+CXXZzߊK# ?P2&DavA#S8 Be1oZ<<=J?}J2aeպG9XNCNo0 OFÞA\zIJ>Eq_z,lNyFZ3;0тIe;"ڮ$b0 phzv&o"1b# ?&Ou{ZFBN(B=iށFp79O3O=ڲ@7'.ܪOЗ<3  LlwIۦ<)U^*&(&(&(؍< JIFӔĺD N J{v$zr|Ijϣivy4-_~esh&ZaRJ saxYȐ=c$9ώKVJS))T$Z&>  ƼPk4Rn5sQA@q\]FT~"SK jgvQ˙x"\J{JgM!򀌑1 Yk1L\Jt:T4Qڦb .IK-Y%V!z'ۊc0*IPDRTQ)<K1(W<@*JZgzPBS֘hǪFa@3#&d5]&DmAVv%{/W 㪽ShvYkbޛYg ĤV%|#0 .R oQ(fnK(GF-L3 n#cq@aXpH }4u]"Sm6j!>Vr3M? 
H|[>el9½a^\|x9 r' D`?dOho=ܥ2Rw|g1Y-^b61}>^s#+!jGL'Hڍ\)i#Pj.QF' /G]Kc;NBwאoL˒ϟFuY=ifVzpY@&.J, p]- vՎyaf* WЩ~V} wTĩTG4m>~Q)1ƢWp 2W*\4J'՛win)'ou kV*b:tpsa~M\a1f3U;xGF0&)+FRܖkpA!,ZW$,ڬh3 0杗f4+8TXq^qQq' l^Qq#Y/gPŀ3bq2 d_20:eXr6`d7$E آW]_|t g N!-BraW,&DNjAmGը,?ud:$x}<x"Ѓ=%GkDz:M A:bSX-HctJK^=RGnȮg arcMŠ/֋Y:9HW#!S Jp8M_B9*%1 rA Vq+ CDz#:8IN42& UKԔ#dVXPX&@BEEB@~^%73 ;bYR{鋞ZKv 1?Vz]E* @dkFi6]ތ)!IU8c(b PpUpxP0!ܠWNIJLjp9ǯ 7|3>Z$YO6փ9znR3ی2ԑ$v+ݼ{cba3mgO c ["ςfC0VtZϫ`AU6c K-JT2F37/^ 2u+lQGWI;Bc袕q+ۧrNmj!{>WHx^~ n"0vŜPXVZPSg[:~ cCZT78~Sn߿S#pNǝEʋˋu9˃qE эND];ɑDYAW)r$vBҥ]d2S :Od-y8,׌q?E6Sg6jڲ6qךQS8ݧTQ}4(9n"2ӆ9,(jcͺ蛻Wekf&lv;׾~োm]Wq.FW. +d6N!uIy*\/WW鏧^y?`~}a=I[~7U3S=cNA GIt)hc&Kߴvc{sc:XހSQL(Qm=Iك:]~>_ya!>Y|os>F!~>3rzٕvj 4HnlZEHxP Y'*p~JׂNiaҠ-t!JaL G j*)=D1 3NԐc+ Xk@nCv( Hω@V/x:0$S/(օU !  9RW H ,a{9V4%b!^,kTm!G$zlxs$"R` ~˳TIE/i/7γ,+k6ʢp1FbgJYFB~'IZI~z$xn'tqH Aξ!"ِ^DP %I(J͏{%?}È;RS`Ԁ#}eI@ h$<mrpf5AC`kz-!)PgH4B^qЫ;'B&2sL+a-VXʩ4+fx+f!RɄZusk5dr(C' QK7|4m: c9׊~ S+RLx. dVXfԗ s GaRb8Β8/y4˧{;QZ~y >gG.a8"  7P0)w($a\0XAWj2]b6?n J]Ϧac;A mF?{[]ᱷF;I0 A R*Vc z,Shކ@C !xN?bWjGz((AiG׼|hUIXtNl/aš2/L) m0/D1<= $4R$(B!b&†` 9^)d$22F3N֌w0B xP %""cF)xX+*| f89x]Xb^b2N2(ıf_ !)!n|뿲1`aU['m7 hugmc#!O5hS9,M3J0cTga ! 
PyL,UZPZ6[tgj:ٓ'.BmԥoXN% p!P)C#s {q=%gIf6 cFA>VKڑzd?1,,~QH 3D>ͥʮzפֈs^rvC||'qɮn]Fvu[> 8 mCCVpA{?^s7[09(vh#/a>`sp9 wuuSRVhi⺇1GR|bx9%"!Kv地G@$E Ch&`a]ܖU@cH)ሄE֬*圦gPVa;% @DDB}&c S(Uo]R.ad Jt4D[~Hڔ׍y5~k _M<0t< , '5L\!N~-IqwuyO/m>K%c$)Ʋ6{"Z"c!pʭKw]ĥ/42 Gn,; nzv0m/&]v1u!|FPZ%aQCPz,uGvJ9jN!+s1Hpd.`gjwX,IN*"SаVY6Z2XIM U Eeqzq@rwiŅT,JeB"&/"t+GͿFf ?ǩ-ҋ/).W"xٸH)흛 sl>$K$1%W.sg]W'7W{+=dLU.6[Wnm*,V&Zk|w@0(Eж"ܽ wZo^J& .O|yTOq~V+*놢denTVTJ5;pb # &*GH+Hx+ō3iB0>d qs付?JHlqȥ.buM Qa's-!w ؁\%WCMAZd>qUSB)ՓSS#8}eA%=;kfB|{yi WyZImŝA-T"vp9vssbb;-"5}겸mY[$ivٚ0QLČLrZch)9ax a%Hᢼ9L Pi˽2jA@1?} 溱D88BeRpQR;mB8 A$F2'%` @uJݚ?{[9/ 9y-V,:@Xbɠ!QԴwml0=9ǶE6)J|lwbHV}uRBL aH1Zbs?X-q_Sp@52H.%6K}f e,~SCA@M'TK<Ǔoo Ѻ6z=˻/' +%I;CEUT5S JĨ11f^ODV fKU-?7ۻ_w(}E^a uR&l  Ʈ6*xbw~V(V5Ñ!'|[¬冺'vuqq ծ..ᾺO?U pI"|{ŵxz#U7hC;g˘WH0JvƪN-UT>8IEB@).:dj~ w*§T}qg߷W,\?Įzn>d6Gs@s@s@s Bǿ / MeT.?{]!_pPS %M׉$ 7&Rfh79^>vs2cfNԢu1I`F XP#`Z)n lͅTC*Z=AVjԴ~]Ga\u6e`HJAjp?emEjԍC{Ϝl-x (;ʚxDdAkf[uate VՆڵ -0Z2+43WX[0HɂDs9l{Z+pjQ$} 5fzҳQcMc0Rn StB|X֮t-h,V׵ݴXsc։7Gwn5jt=4 C'`8S ^KY1h "0Ys +F;8B`pԂC퓴8răOH?0fO1&c 2xu'H"%lIԃCv⎦Os 7K\>׆l˲¬Ls ԃq+Վ,bԃI[lSzpg>]\1`F -leRDԃ 3RG,%Cb>k SsPi K)tw[: {w/ FPJ+pLD ;wI ٪d=IDlf>L/u=rk˧ɥʛe82;kFF-ߗ|:f\ b,r·2tt1(oe y&eSC#+ߑwSS>'h&Q4Hb/:My:qͲ)M8 jaFR6.BIbbm y&z`S<#pd秸bp+YTZnxn͙Ҽ>>'w-=ygCmѝw6$Qw %_>h#&?]A"j8 Ll67=UӺ/ƋR(V_%T~;:u5 [mNCc p-+ *6xq N7a~_QŝW@F̺~b>1"|qf0m} lcқ WJ}zb{oE~9XE^`o.V䴣JRn@> SBU*T>sOogB`*Cg;r uB0H*&MY24FNscn1*-s ]Y{{$Wpo3ЬP٤QԤ:2No 2jM2(U  o m5:zy蠨0BT:nR\e5sTř7R03B2%6JHiU>rdЁЍ΁I|M2LUffF<ʉi]u}&L`⺫4 \+}b-$>S/U\3FZ+ɤ-TTi~7F槲h#B˼ Be5HV6xOCf0EbHkWe\$$/~1ن4Pԡ4êvRmx 6Ĥ{nAr\m;Ǽ8.tV>TO7)kwף/,# i66w&|_ `bbxIU :.**f`|I-䇖y.Z)zpyR뭜f㢊]?娇 6OԖ"Cz7ۧk0Fm5y)͚)f@-le6/Izt5o=4J\ N&A; X&GqLϕYFwR jtD ~]!;sU3j:5#IF9<;qcL sx>^o)kVMQ2xO;K%FJ,o&QU3w!L1Pl,䍛hMЋ_^w tr拾N.kg"M4˦#c[.16.T#MtMhȽ㎞mS%;g1BHgˋ_[>@CfV7W +AZ0leq3|:@2۠'Dvu*.ıZxj.<*])4RpI^L9"-Y%BҜd Tv;sDziI~2 nl 5%Og~C~?97(Y>|p$MsjlhUՙn{84z31Ls QIxM.]dмYrR 2Wb- h]1es)ۍ38 Ԍ1|iEi\J|nf~rD5=[3XMNq4 ^6-:5Q# {J 6z=81L Ӎ.P.REbH!xHN983XK Ѭ%ۧ0`-I(Rd ґ@5:C8hRpQJF,`H0s.pN`n/@- y5?~wGNO'{axPDJ 
]W?FdjD~БB@{?Ot8|BbcI#pTJ)bVիq8o%tJJT'oO$fl(Pt@>)vtFz 9g1Sgݶ|}3 \Qb^$D=r79?lpq,2̉"&Jtw̛k0:ר@%@ZrqNY8j4N*+3++0r%NX1=ج 3Y+ 6nq4c atb=\qkqq0j bB&-Z)_ %#mNS盢a-,15`uL<&LsZtsvj1;~MyR13;>2wQCbQ] &X[S@3h] Svv?8D`Jqvvkȃ$4|F-vD/Bݚ4Y!p.aJ'bJw@r$vk~)-dQ6x:IN.d\błǑa!G{4YXUKJ[?t? YFN!\|i^mT$խ,)5T,t)N[H{!)r/8 8*۲U,% ALN j){n\8 ~0ͩ[΂-,4 Vv&kQݘ+옪/ HDڪJy:HDjf*8' P-y06PIօ%OHQus%oՊus4rQ9= r҂jBe8gX Ju.'җ]A J(oR)wLD2դzsD)Ui(vFKWJJC)-gRPQ7$PQJ 13F)MuJZ 1O/S}L5&J]Pz(4 8ZxKv(4 XU\Pz(%, +'RPZPxORPJɓ@)i(TsD{:oyOCgKӼ':DR+WS_v꘼wAyT%TuPR*.9{yb~dz<~ VL gie+Cs\f,,G3֤g:Ǽl&c2**ʠurNN*UG:=en{&̛oup.6> շTc. yPJ`I4T%/ y4-a_ z(MKXP`\"I%o&r(.xQs'җ] yI)|6#OX둪bb'cMj,tՆn.ψEtwTx\&k H^d3= ŗYR͐ƗLV|i'uɄhJ{@203/ Q T NYϙĂڒeg/SX,"ςܢ2mnTQqs4rRdI4T{ v3JEޣ=ru y"qqB5fq(" Xİ$PշT3/RPJI4Tc}6:s2RVDV|h0XHowib=`/˥ʝA׸8 |N2a\ij ?pgHv# YW@Ru40v4ـ\J40{Sexz;?j^z!DɼuHN)mF,B],; fܭPQfCX#JbD­. 92ɐΠve<6PRÉtC R{|ʅrF.s>Yh.*e)*;75}=ZpU"#V0T_Zb!Ph-/q睷 E<6$32d,P h⩪6omoT"m&o>89+7*OڔO /v)4e0lW9;ɻ+3%Lyl` ҹ@3([Gx}?STٵD>Hm*C%˘qkH MRn;pr~􇝋ֈ jS>.٧lF{Em?ADW LuZuB}ћO}{˟?o <:p_`CXV[&AV۝N *]3պʹ\Kc;9vGޮdCdF]M ?vWGuE #Hzcb\ 'Cb Qĭ0Ho d'{Mo1M8EcGͽ?w;ean$$d;NBZW}ӉLTMemq%?).zɱQމyf"߭YI@+.XV  ˴x`HDxq'(HVҀ"HkJ`Z4aYܦZL)߸+] G7yd+S'WܸT)L+ k74"5C>%+_Ŕz(G=oڰ" |W[UhM>Yu6iʒ ~?>uEKDd_ଃQEۭ/gޛNG%IPbH,0Cw6'{ŃO~~}[eZV)+6ף1l?JَB-(&&?G+XJei4YV>5Ğ\Ky=v<{l#vǢw!E6s{$gq\!}-! (4 T7T$ЎCd UW Ec)\4S BN _4D?<}Lk$rH.`YMZ`j76)\b$8:u.-ǰVISP!FVa>׊. л< Hq6nߚP>;mەx,3cE3&"kqp%yflPp#B#..{nY'~aA1U9%&W]bU;:j|]$5*Nּ|5U)+ELJJ$9bI*1bS.*MGL 䱆60)fxƤ e$$K=Mw)/7DTl ,yyAXrs8tb bx`Xɽq'F#G3#`ع' J1z,t=[13ۚ\u3X"aC0isj4 ~˕72WM `91$`g_w=hTR6@NbF)~k,R9zZ=Sj\g@D.0c;W6I*d΅Eӿ oq T']t1 5G !å,iB,+Ԅ+'pvVHQn l|)jmV[NVi%;HS%L&`R ^0E,(H`*r;F*]{Sƅ@X-[<|ł35W*բEA$oOKZcq/?{o^[db=ߧemwݥ\wQ ,&+ Axzrѻ#! s6 il}߷xE!X8ɛ/C {`"px-^'Z%.e:#z[%_+o[ݾﺟMo Wn/QɘDׇt&s2zB^RXxhyhmyXCh0a6xZq*N*-vPs;m<ض]icR9`tbaAp1:S蓏t C_Ɍ$2ƃɌ.LEx[ksaIR$'c]Z,V?vU -+~5&kW:+!l8Fބj ŋIfNf$@?DWjV FY=lq&IoؾFbK$34{~81z ^O']wq |W;A)ॷۜRb lW ]@`Z"G"gYq< n_)j[TZMt݊xb(. 
xO♭ѽ<)u{DvдR¸ej^1[0L[A`2V:a9Ή+h&9D)EF~v ;e6m_6foMRIcZxk:VXͤ[j, Fh{)M1*-:w;fN3[[\",+){oH~y$M"IFK2vɯ#hlagw6wX0-a;8zX0D.~3E NUD,XƲX[ݢyΟsr dpզ[Iu+$-b+ka}&EX\B8# -Ln?oړܨv2޴ǏF_wqj0e M҈BT[4c~<9Ft'ևqibF8JnL/"5t$Ћi}1-xLwdѩy%RX%TF7G`=1m jpF Aj_M=!@s.!GеkzB(!.S +'5ikm+5aĞCULti9Ee[f~RNTl?.o ڨv{]^晞\Ubd믐쫫_'1WAo4ɫo^틕NnoԎUx˷/&S[JCX}:zM}4 2KYqO#e]ZtH:;pmCf,NAbP:cn󽈦X\#VPC^8VTx[-%S;F]N(}hV.PC^8Ta1lkعx%L,:?Y 韪P:78)=pyYW^̛9a= 5r;=K⽂~ 9ۮ=˸+xH E [%_.NNH Mڮu4}EɲYEeAI@OFR>&CTӘzr%KR!:TfϖQjFkPu{LkCӜ513ε^9[u#rՒInY!Jpi(OX*"Ԇ1\M"=Zda w\/"HS|j$uI0_~E4ȩ\(·ldsh1Mz ľ}† ΕSZSQVkYlBRCíʡRxϵ,BҠZBhU&&6E@+=LRf/j4-;H t93D@r;dzڣK=S]*[AzԃUaWHe)YQĠ46+4R$R<&3@)]Af iu6*l\_iqH"RlPYN;"SI-YJZ4Ĉ\+`Ԍ3+Ǻ `Os΅5 oR*)wBNj?DNyp#*쯕ˁA'e{5;>PCfT]s$u#[ge6jp5 gcQ`Y5R`HhX& SW҅΋6Zqf^jOیo{5S5O5S끺Ewgb~ OhMS50Am$E;Z?=DX'ŒͩwS+TwPC^8T፺ڞBfĎ[jVG[+J~\[k%ovW)7ͥo z}^2A ՞s vEx,Z,O5"M Kx$'#)uM],AwվeKg7ӕ W)ܹٟ9MC[j:H[~&rW+ߦ}4fxexa)R5!k[5{X<{sŻ'posrv56sbنLVZ;)v֧=ƫP, UHVDaQ5/Ŕ¢șQI#Ylx'kOxK/g T9ύJld&rzyty_'#?vlTZpgDe~reԃԨߢc?>p G_?Pss ?OQ`Cd#6h 3l²3Aʺ=֚%2x{q_m`!^i;bX˙77dI/nʓ~kP6bȘ0;<vSGu̻ǚ[8F`^= #M:0\k=|lz\nsNa&P8ːJ:0ݒsL|Hlɍqu~nK&:<rQےl.aQ%+1 ^°9~{?rN(3rM8DlFO&ef;)y| Al-1uP__Dg!pĹAPO˄bvNNGWo*7Qs SNB436Eѝ8h)~jRLwU(t`qbP`zZPwj`ոI !2s#eҍRh[^5 ۀfg>Q`#%6*ԨLHzM=cGΈU6ءdL ZuS|l Ҏ.Ijrl,f<3$ޕHy-J"En(ȩ!&3.lg]6dO)!T@M$%Dk9b0C%2O^CdT5NK*yj`)(lU9Wݨ&$I#oEo`d3d#,jopajdS# 8HeD7ɀ'^%Kp<.l4Ag/6=gzvWE8gSq鏰WOt+h[ /JWnY4Ɇ1F.&+<[g;/"fH='} ؘ("mj<'1%agZeZ$4EN9'+[)>ޙSIwmEyVSs9|}S:`nʮeҰGMV6.6 ڇ]v`Wֈ=WGe, 8ch9#QmeGn=IZi偗"Pw%4 (2,yQgrecI$R_z(hm}@v0Q8EAh.&iZ1)C+8&Qε([<ǘ06 xc+WSq̏0 vV r`O)#~*\}@^N+""b(ϙj }C68e$UPf{a^DQ/L&L?|IT'NO~; pmMyY=z(&X^XFS#v,.Cp"4Yꃯ|rw-vSR'a׎w(νi4X]_^H^[;Ȇ^@0'pwA-q9a"i_ SSqOk k#5yH~ }#@Ob~ĺHo(\Tkh+9SvbdBv 3sh+rieYg3XtQSbG j8Q8ơEqguR<AZhԌg B$" %#!*ҠˆγZ4d\4pc<{2 ɳcg$, LmMNBdI:{.;(ͤSO"4hkSIϳ.ӹ:|Sru%TP5E"2)RQ|*ܒ"BHևeݔu5~Y.'˙OWDjul5sPP#O]Tӽ. 
̸P@s> qV͒E{6~_WnB\f w[Oл ۝Nm)O 9Uv>SԴO?Knw+&# :"|?Kr1u_+6[Zi'|>/拔?iG{'o[{'o Ɋ1dLܠ4(gΨ,eF%%7$hR4S(fjf+|q8r\ ځ^,|X]@V;r4[,ǜ-OAX|?Ճoͺ B &5kSbݏx޶x޶x޶xvtYA.SȽGiI+8} WB$/qNb̦ukbC8LNc]k݇|jnC{SӲ;痯 x}Z}٫FQgu }IfE|>%0rĖļX>x<8!eRƈ VSɣIJ9ἡ`6U!v0ydXX@ < &+.Iq1y &?1{*dywSR!8HO?#Nc1;w ZKE!p)`"i4i#D:GxdM"렔 6 IbYż6Bѡ!'Hv7$5(9?p" &ρ&ԃsKzS2R$dH-0knM\AYzC6~FĄB>LhC̒Jhm2F!p$#fc˼ъ\01 Lz.'۪e)OFE'/ $8 $g+C,_24Ymu}1d6Ƞ W($zU!xpε9$$w&tˉ dݦ(V+%] ;\=<Ģ1Cv `l{r 1?da=k3uzMݵIu< #e 2r=sήtp+=Iqzjpy&]msF+,~ۀyq*e{SK֮8}IN d^$!);ޫRH ZG8x_fzI;h'!ߏy^f8H ܫbyRzsefI+V(3C GUe&)@+ȥl,*Cq4A& ÕY\%h;Ms2JOЈT4sA3Q@dYM#pR44r̐V8\tTn-e{=^HY4"s:31;4fޙ3șP! U"a܀h,gkt+T12y 1hLLg9)W+JTT+r 'EZpL>qm"g@ƑKidBbPήԒFBs2>l>߆Z†QԖWSH/xnkGB@IV7 Wݻ~-(lsu& -%(E* B'wDI#dʩiYa S.璞d<(B* (Y[b{ рB M$uy@Bd am+89hsVe{4*&w xBnf-j 0K4;Ne$km`#bxQ![rkl$(~CV>@mIL>A孈3Vٞ2 {jj+ކX,I[}B5S 6tOZˮ0= GJ'}9!e3b->m0VJL>m, 0pb ꂗ͛dq.HDAnr )3\u}N[%+ /[j6x.U W3.j|zI!i<<*ЂHa@eGb@9Z!Gm2QS/a(6n$juerA+7ź0puQ $bUh_gDzWĔKh;Md]۲4}\S>esג#}a\Cb$V$: TnB PNV%׸8^W]ctoOhޮB=u .C3Ge=nczM'7>yf qT*jfOQ6? 
ŒYOj!qC;ƒƲ@DB#GghTB5HW s_dw(nhiv80 (8g.;E l $…RSڜݲQ1*gCJ%Z cL4G[@9H!UHtTqTOBQ\4NQJ:d(e*OiL%[$%*FIj(V⮓\s) G(a_4jA8ZxʥLw0!h RjEG[1 0Ҟ1J e]§VAJ⼁SM4#oU8ܴDj' ]pP@ªN1?TuV.3CO}!&9‹ՠyE'$eJP:y% 8k8цO:k%u,m}=?K,. "p"TM{.xລyZ+B ~B\΢>_oIm(y'td.̅Bs>ALg_V}mFO==WKiN)~FUîN)O3ֱ{;ulI:{׭m_zSF~?UTNPW{xpO8#6UcH9:ʲ^WqzL _~w.ܼGDgTj6*q=:'ӈomcax-oT!O1uB_欓 s!W' rޭ5n])i> _[\t7aT!l!`F90p1jmjAՍa7'E DmkVq;ϺQ(I iF|TT !z>:gP͘#zǖ馄=cy SP '$tr~6}yFvv<%ƿC y:]42NzkER2!級\ڏ? vԴhqW|fspQzQW8 t9+Lޯb[.v9XGSf!,Z T ֣e7\tK;h!0p}Bmc=16{If( Y~q.%I͎g|%h˷;An͇ Za+$ȂC$]5 mP5f|:%@gө}Y:tY]Xڦۦ^}U>..mHmS:hThgNŅ # wғRLOUdim.ʣD)x:.r%1>֙i|dqYЮ|l=]uoC 'P>>>ve7*nnw\ Lƺs6(Zu{Bd z GBKI/U/v%_:S5?[ZJq<^d]x^\.&Tf#>8IA#$ L6iR꙰@YtF :;DFj`rv]^@(·حbKrJh)=Devr*ĉ4Vz" ,9УrVj`DQEk388*L1֦T%ADmP-7Ncžt%bLZ eli\{ө)#Hm ^nclBLCojWm }.{tX]֛p|z|LuظJQjRݵib+U²֦kޮB9{hCf*KK&ݕ%*$+|3O}9V١jNN;Ra $"LP2 :x^3pI1H l%tFK@xZ>Zsǫ*ﱢH 0^1%@'%g66D ku$IhX[dd#;$oy'9MRԌ2Ak_J%r(ēr KJ!v)֕t@gj';L ESvHūS &u$G{%`wӺiNWkNK$8NkP|uZ4f?8U܌mSxe|"ShrIFGlREix8\^cS_bi& "u/-9a'PAHB u1yRI!ta2 -}hL QFg?"z)Y׷oɹZM}a*Y^dLuRgo}:$*!HsIz%?~j. 
ۗ>c%y~W6.ŏ9CV@x*3wUk3CROߟh_eMR}s&>T?Vm{7efj1(1gtn [TC{-nǰ376%9摪Щ?x QIK3d# b 38sinH9t@[?Tr0?^*PSimP @}9rRm1%@&][s7+,lwWaVRgه 3Ltgki )i4$% ¡2Jncȩ,Kڙ5 d-TG:|;[{Ih89^wCFvO!0DУuorx^ӆd4hx /eM/'L].%7WB?c Ĵ6ޭBU7J1cLk`^;pV "4xˌʜJpb04u6#ib)ͷ"pslI<^7o +r5gɲyp䦕[٫/vjڼUzQa6.040%}ğL5U,CS#N#u $o2i3qJ\fqGn ('QnTFeXhxLB4)*F8Sf ͼ˴fIMr[iNhQ5jvܤ_/sڥ;>ֻ[#>4sݽ}2-p~o~Fi>5d]CO?L9&0.z} *|`~(ʸOϿQ5a FWvvax4&($Yy<-'R1iW778Vخcj )@pGpwǘ%55+Wf};M[,Q F`xMxr NB7nC.@%\"i^)"'n *{w )@"#VԢ.QGRMݍbo_j,@ ɥ% o&V(^GV/`X  X&5DMa[mipC Zms5*(ڧlOObT;k}-cZXՎ$-]SaMY8bS&}em*Tɛ5GwSMtvt&~UՂp⾷BD=땧Uzpz²ۋS˧]&(];ݲ6}j\b~)X߭kwj' ~Ȣ5E7}wpho!jM*?-RJ㑓jqWcu s%hr@szSl-϶swG qGKhO[ަ4Bdt(t&waz86&Ph%DGK]<9*&Ƙ>BІ?v_֎ ´0{L !GOo!#ʸ>!zTw5|U'|oJ6o; W{nwBU6ppPBQCN֘8H0/vә[0?8=:S㊋s\@[jݗvRؖK\_n&y*O^hXNA&6i[Og1 mD)P _ɕA6⵿F{g2WEb , [T28,򬙱 4Wy %u%4d1WW.T@`Ds5'PF'΄OƙI1'>'q*HU` }I{Ʞ aT ּ& aUᴚ[pUyyj␰,xD_߾}|:4Hj6.\Y3 ePI:[]jqJB&ao{XDr^0fЫ"h`FnbS6p+&לn=u]OlʚB\D s& Z֞AO-=lϪ1'Nh1L]t`73\ U ~/Q6&!,6~٫B/]]-K-zC5<Z9CP[wUɃ*Cs j%EowW6}?1z :7<BhtkV($ʺ}`=3_N& {?4W+_KBh}bqabl31. hw?T#)3Ȕ i2*9'SsȒ9H"QeUJ 1Z^@e kS =BfΤ¦ @3ÙSVB,KoI308&9bdFSm^Ģ.#w! I+|$kr kjU1 /WbdV0 Pq?9Bǃ!WnX =ZP)aD)gLov~~=膻D d2BQTNQZQN9+8'u` |TP-Kf*$e;&,EC)Kyޝ2aY"MD@BL5hB'](CrTBJ}2L{tFЮ@zԳD&2YPKtJJѤD/p%Lè֩\ IAILF3WFTb,qMH$kY?jZ#՞, Cʍ%Ajt$c"4RoRuMAKD,**V8={2^9豓c"rfB؛aWqZCqUA(wyZ"ډ$:gT+H~:R3HzMBZP]O=MEjܟNXMAQҧ(SMуMEjCR⤔\2 )}2;ĥtBBɢcJ)8)_8$DJ-%4Д`3Oʦܻѯgk8YV#mm+L㊢ŎWOs1cfSsN x>>.Ǽ|BzjS^Klڄ-t/ F,t9){h*&oY5dxkU ڪ%7h($)T(j(w|XQ͠0 YrCW@0‰CHpT:FK'TQ8%Lg!+o_gCFNs+T),D)%f2HQq)pIʞ[I;I GqW붳N_,SdV. p_N7T4s0kuHDl4],_^]^N|:}Umq=4`k5^0szHfΥytoO(;. ^=h^Piwx*C 5سlF35MϣG\o7OĻ0lFk)f=3>zMݏjO@Kh#Uh5"QMSsYDR >X%sX&wlBB[Q&Dv7{=2u8/S4-:?T9C23U&U ,1Ku~}~Y_πP1y,lBEZ6uօ[}T$pd]OtC@ֆ}A`9HAY#`2>NDՆ=\B'A[re&? Qm6'3, ΃Vnu"şՕCj̘ ZSuy)woGG2cXS&: n"ZCeo6||3䞲Kڡ-ĭc9m>]^Z'Lz^m^oo ƫr Phdu'ȄJŒݲ[<Dž7[L|~Nb=;hv,Ҩ;Q(Ij>leVKDяrً(r"h)$G!?:;K( e 6c/*|A惵ǁiD=%Y,>a|-ޚ^PY[p f 51oىx F:^yOg$_$Q`y܋[ˇc[c#@#K>3vELa+vMF\ w-bGQ޸#"J9|`%.އ`ap)$](v®p݈$]. 
"$CmDbfN b>UK gX$8Ů8'3XMe %Aլ_ GO8"PpQd0rp%8T?|i(AZtdRxw^`%QM duD2L^'quLA8YvwDX1lSdy317>"9T/xj Nb\7/˧Gi\(5D{[b|jtm5-*3c*p}('z7g ³cvddaz=(/䓛:n%,2oHKgv|'q" LW}mF;!02"v>h"2`,iQYe YY($\!*P!Č PU,?{$Ήp ec+Gc ĂEQJqO]20,A Y-#B:Bڻ abk暫}KH5b=SIY`Y6IW Ģ,JJ969"2gDFݽ]ѭR 8K+6̒^Irah7ߚw& I^8~hy XԣHpv4a;O0AFC\lX7I4nOTx4+JbsyTpJ$,"ѧe) ڑ!TB QگLdkgzA&.oФV:fHoo?ibn Z#F^yvvPhx%p)sZY)L®rJiBJ "ρkA, .f0)-_\?L~yU&?D縶*nd#,TL=[aȻ ¦2ոNJJiL%AN*N W2S^qx1p`c[9-ޑFɬld z)?uKM*BC"pUBK$`Y,`.RdJ#pؙ6id) ߻;`^#:YC !5qEc^!mFa7mC%h+ͅi)Y |֟Xؽxp~Yy(L*̯X==)Ǵ^\__[LGo%?y'O1br973WכP zKh}͕%][eN3% ɥ/_ܽD M gV߮ T!xݾ<+еkǨc_վ(Am6lxz,X~{uqf=O*az$JюN{؉ ڈ@DEKEc:G̍8AJ0ej'֕Qʺ<}/~Å]Ũ$tD$aR=Lp$J ׈ktr8S0sF1; ,t]znuޞ62LL9UMڑ ໱`qswdTgddȝW׍~֊ ,{+&@F w ;YIHNH?k%'޵g-({ { yUoKЌfG-R.yň"`=dY@* +a5x% "Ui`x+gɽDKQ0x3 7 l< YCS(_Wwb3WHhXڂ|$%3xՈ-y{)r 26b'r&A"gb4$TJJz8i9,H$ʄB k5-+-IDc'~tt" f%gٽ}ANYfda;G;NR+gbUB*<خcQG_%uE5ڦd6⫣ :bNY£20Ë<$@dC"ly,??}) z!?VPe n粛6~w7`d;LNCF$n&T""_֩w<]nn 8J r1+L~|c1S,>%=LE oo(Am~"8 Ebbe  _bºʦYeb˃숧;{r^@g9Ot?F5Yҥ~HWUS -E T[9;X~(DIIMԼ8eFn) ĕڒ Q)-<UJK[Z/@=B!~tJ4#R;mw\ًϵ;N8D$ZrS0O &uC+PpP*45uI^mc{.QhТGhZS B#x8hԦtkfMakp-!g8֐ EaLO"TeCenj)4)loڻPcw7)LMϬ%xP7~?7ne7_||,P"SfC & Gd#pl֖4T/}G3ѐ/B5ǫPK֨5L8 _MQ2כM/Oj'p7z$7 >/6 nX 7~l*qJFn][R%U;*͉øssq#H$3'`ۅ:FV ZoZ\MI$ nGu:&]I!bϛU^ݪBwLw#oju\wlFݍ_Tf8SZ++nlQVnrqσ{_E ѭ"遑ӏ+W$䕋hLw}Q FtRF1$c5iHևrM)58}ہ'bgy0.m:^Ũqa ɩ%!Zm|1NPƁg2 y\C3W 5}[Rd qgЂ$ejbΕ~gg(7˳A܎oqK^(٥djȍʵ J@I $՚UTl6qrUb.*jbcjs/O r(E E]~QqXM-|6\L\ԑ8X[,[ehtS&%ժ{d:jKoU]Eg3$+R;&I %cpj/ y.JN-zl9Fզ69CX,vՒZL^ g%g\D:(@c9s`ePyƮ(pGTNSiٟow]6Iaߓ~8 2 8Xw sR/sSZM /cw(QRj,sԞyjTvZ9Ypm<#Q (0`y9HBpG%/QhKZ .5 F+r29:eV_S2g%} YImYx1BEZDjy}wB;%1L6ĔB_$ ='&J:4=oQvrp D̨Er[4f- zFJft х- ǬЀ<8O9dO7WW׳ﵫ>9+OnX*TS:8B߮eb0TBxёg Ggn$(r tGa%cR(iOz@њ<|{Dr>j30guP`'=X{rItQ(],WPȩmDZ_o޿![XN*bNԃϜEc,_~2>S9Q x R 9RQ)IV60<[Hנ7}Ֆ7Ro{ 7* Cy0R+hRt57C\UZ?=xB2c8UƐHٔMsG^2Mp̆\B8K/8W^撁@$Zg(l(a!pT[u`ndvnLˑJ)4dM''Nýi8Z3©@i\8rZ(Δ@EZS ̰I$QmsqGr|ϗ_݄7_VIZ߮?J`H6G lnhI RXBYWF"/J"u.spxRsOrsEH\Μ+k yaF0[܅Me]PZqCl_X_n}s~<&t1!~}@@@@: wTbOPg/Pms c!g'uJM#`}Dto;R ȣC۱jAzsw#iL)xg3KBPM4+Jo 
Fy(5ZegdhDFQKh {x3s898ЏW&JBNX\#&^Sata!=zqEm8F sT䓖6" QHHG)] 0*2Vc"0qoRH mCOMKK8=23"3ک]q'zNP#j-.aę]5K̮ӣs0jQ 3 3ZO2:w=nM}! i}Tp0tɏ w6mHk 45VjG.! ijGC]XخO*CilKm/"ժc1%?v@WOν`؂4՞\茎iM9-`6љ5)?%>Zύ<<1:6OBrsRGu[椶"e ugUL/o8Es0A/=)v#2 t霕 9'ܚsZ9jDzJ`RJKmGS^b=IZ{sS:KZln;|AD"gB/JnWT ǘZ82<#S%54u+m3J{s\,}5ە+uunSX֎uv,4_lt$Eai-b1}(ΒǾcRG'Q85_,iGhOK!e Xv(zъ'm=7 #kfT40Q8u?2.! &UE&>b1ߗ!ް@nCU`jCv{cC2`pl>5[hxBĶ9F!lUD!`ݍP?gQ&\wy#?< lɠґ6\dK`9bdQ~$zw.28޻0(55Hx#X!0 eԗ>ΐ̀P]4kI1ڝ6|ToTׄQlkxKf7j#ryf@)mƹweFTR)/Lkx # ެ[S}n|ԧ1wj(]l Զja [6Ӡ?+MФ檡c6c;^fOR_t>YObO~w%\cxmZXrݢ' 9ТUxӥ p- %r7dĪ,THrG8`(]zpǑqXU;M#̧r9Y$9zB5iϕNYxo-`*,p{4ŏ )lMM /ȩQ㿜N9ҋy> VsiRFQ혲NrXY2ESH) Ueu{ A/C_, UOUcwrRe]Ё <~_ͮίn/(2#4g/@[,ל~>w) oO.S@Aг<_,^O9b=:nPj{)tЊJb&7kw}k( ( OVle}蚼U(m:K!<nm5<$0^e `(!E dP̅͌ m$V!YAYf9GZTS.%J;#)j}6Zk=ko#ǑЗww З#H.^wɗ=3=/ZI!N~38T»"Gz&:Xb3yQ4HQmI (KD]5+0 g!$.5J4p%Nf>gx)h&@MrԩVF:zȤ岔 F&C 7f W>jE8Xz+J. ʃ$JZaa`E6h), H+"n_ksaNZZhp۟=,UTA޺;/?rtuyI2Owa ږiD>?P#ϋ0.2?5Ə8}t^7 ~Okݻ>轌̿  ʇ>cͺ[AKɦ,K4‚1^ 2!o{cm} ؿ5e^(܁1QhgDֲ8 |#yQhxg?@kYCp 啂m8rY!y}qʍ$^J ˉ?tؘ}aq1iP5Uo;HCw<!4tF!t )3U!>@8uAdkOvH~^ӓ7uEH eLFxx.Zu2t|;{(>چŷ7&X#@ɰ;#B;x n(O`c+UZ(C*>-zdٶ^8a`ڮ{.@(bk ߳2/Fsq9._anq?gJwfxӒPrNv&NG wJN#N(|zѐ kHurpfywueQ-k]d |Í(Y;p֡^WKٛg?*@J)鹜562[O6ɚa"9?M8Ĭ-PDo/BP??ybv*/"BRlޚy9!aCj 7g7fuKngdଔX۸IZw8&}rk4 q n ̙fĥx/z-wX|kǾ6;2ss gb6>6dDG8&21>5H k&Ob b-(c9ԀXgP{`gMQ.o*3*0|0EL{@zÄx jcTsGFqbcW:GlL;s6D9bubUTyOUB(*U",K=%eM[FSN ѓYBL8QLje=^ͬOBF,!*5 igyʛSu);7$jGx#" >۝;w]li)xS0Zjqݹs0%[ާHhHC3pם{R1P}0ўUNrn{`,4fxRZN T}*ԌsӮy瑇9!-JyHMjޣT<5H 9A7dNAHZ Wx\W #ay/ez#Y2h#+c&kψLv HW0G0Z`k,㬰U-[G+` Xӽsu;-(~Ai9kw*Cq:|2V[NBzavPzHT|5׭:p rĵ;.-qNgcux)hk㶚 ȥVm֌#b&Ri!`Q K$?y Fn TT` 6Q ܆@KW*V_2'Nᐎ IXj<,T5jAV[rIph*d{ 2l%Q\<,99fm]C/Bi'\O9RmZl{+<.Y)ʳں-<2p87a,c8enn{Buccb3TUuS< g2I@Q\!"sN)jTd5]jϋ [> qWC!2_sEkm4Ҝ>½ʌq1%}M)&i"ij&%%pL2oLFLqp-K P'Rγ$¦gN,ѮMj^78N*o/ s9* pK%_*Կ!FCujP\ʅ-D6H |>>B=Mh=W*JI} ¬+YɻJȊ]!'FR(cJzZo\*y¥ 9BJsfxi(t? Ԓ2sӖR*㤔 AH)qRDr~ғRf⤔пJ8ǔ}PO+P Y 't]䓺 ,o]x)Q.(R=}{ށ]“d~ZM,+* B{uOr^qQ$)%#V ! 
9@-W2QݑC&0 ּFʓLtʍ=Д1mR#6At,bZgv?N0;QNJ#˜"k|m#HY׾,Ί( Z;>*S}֞0Rj; gYLR1tGugmg\Szd('1Ti!ۋf,FcJ} $郗/!9Fԇx 1)$SkDx$Yp$d.JS Ý&CT'y@KML3a4Zv2[ާ,2U\rn})g@@ i?QΚWitX9.h|H=XRbMqq8͵?E6gԄ.Zױ/kDRl,ǠL6DR< *_XWE,PE Ï؏\<(6Xc0K4l9Q齧A2ёsCBgU3ZvęဠTN:S>Rv(>rA q|&QpY$ l"4т٧b&5a92&;-_&M˛ 7e,LV>sIR f$EQ@9I_&K^,3$1}ՂMj2 3IIS Єz sm#RZe)I4یsbW$Z HZޫkL0!NoMw*H* HT4Fg&uRLTdT*Chf)R\SRo(#('P#["_+ lB%*J3TtNR;s+O4ƛF#hT447??,Q};i'\U>?ef/FKJѕbogqƏWO9`6Zw)`aGDsoCڪܘ/Q?|?zM?hƺUx{# =s h@쿙Wӿ_<|wn491~Ybe;kIBrm$SNj7J"CnMi#:Mרݎg{r]kڭ ELV#j4knGEP.[Hֆ|"zLIyQ7W[WAI.wIͤdyw}qd~y?K/WK_BOrvy a A=Bhh\fpe<2I AH)qRC(g)=m)$l´%~7 ZQqDG27@3cʹKӊ3kbR+!'x*QuBs%DyAQ6޵# "Bu6yq$L:d{29sPV&eX_X$m`0- U` Hp{ 3~޲x쭷ZތTY#KήOg g6+1~3s~I̻|'1LtemjX >9MAǟ]~~S9>tJ#Jt˧{Փ_ uu3fyz!=$ >Q{|8jH+ q4FՎFc*GC?.@&(^6x "U#UC *,+l\}@vw Z=<{uo~t=! u^Zw>PN~N7ۂǩNY藇!y_Y"|m5=t=`әtc.%s?d=7>}\q6]XlW ujzOC؅j=\ +}vAdKqwߨ}s:!)RbjV =ssΙ98wq;6-71}myo_`j!Άn!姿|[} >< hQr8u4n??zDh}b-c|-}i Z҇Jbe **M󑦑}]`X9le7"E`y BIsj&!-O(^s?Ă6n-BJYs?Xj1,qۘJnJO#1sUqv03e-:]!2%!qgVS!ZxԸ/Zk%ܲsPƺg4cZx'$PT#3,.-}w[3q.e4L0 Y8?야:#/ev%YcRNs 'hHl9lVt:;/yqۻRw;_Yj>9`ʒ*_!Xi+{pg=X_a[煬uŹeb TQi)<I䧃mX| N ҩ8`%*39 31Y39< +sitIQ#'4 S]Ջg ¢4d&oUwrٛ+8?H;c'l>_uYxƏ,rspcRd<N0z"ե&=> .k<4+DCĊptfujFb 72O:k+]WfMdl[.ψ-Eg _{ jp܈ ij^Wd ?.pbB7%m-mko[K~۵1}U &D㣶QFi6tt:ԕQx~YaZuZnEɩzjӑn+n νv[2-iթl$~z4reI_]Tm,SP9(Tk/MŬ$d6k,EIoE.k/O) I¸\^ ,ז{EH,bme}7uF%C%׍ @V9uKN"0|WinDo߶Ug^{V3Ѫ3U$'}貁;b^|EG"8c>p{#ss iUnPv1LVWoBM{<CL A&/<`gRJ94d:,G噔i ?W )G aìg2Y9СVpV[7j1ؑ (cHlӔ?FeGpMoɰٜK T4DQZ߂ d%"¾$-[C^PQ]7;'M6Sv)/Ld>TL&cfWGDߐMA\#^U!Rň}5vdR OEB8D9˹u;'eiW%K[l+c]%iĚ45VirB{bҢ7Q͒A38.?MNVwdQŷA!Mwl%HfGjɷ G yVeiJͶDVzVjLz%JͶmG^f .jI(Cjs{7 =H+SksBf!~Sfґ2zZ!q`fQƽgk4vy^z)ywwQx ^o2u}i zqdi307*FV_nSyQ҆0u!C0 [w6t}c i Zg8a%8vs@\T6]_?9QVlƍN7S˗j@)iDZV^fl5mb&ꦊu棡!pfVOVRj{eeY*dkZ_(81'/GHGdKjaV&D/%zGlNDžU@‘ Ր1 3w yyӇnYQV0j(Y==zX@> g/Q @d; {㴝v~u}CǀSuCgtCF08g  [^ #%Ш k/f)e^8uOsl7BTlY+ڪQ 8BcNIn7v,QOQ{GEkk ֐QP\;8b7CM;uu˶{}` AӡP!RA#V>,f;Oz)bZ6[0 uWvgɊP4 h_Gm D=,N(=`RVwSA&>LbKru$~-&WE\13  $׬cIe*@9iZ0f(KJ$ᠽ`Pܖqߒ+_k3n32\O؟6Qff|:g4t7ͤ\_sR:WŹLʵ1)scKNiy:#- 
(A4,ŵ'<'NDPTWe|2 g'NS9iE?%(ZuܠLqWj'8 R1?{5\:۪| k}Azkԟ͎IZ_]t 'uΣTN{=B %2;M֝Z$'uF9-o !CcY/p)-Hp|7_ޜGeh7i", ]):/ѺU#*\\4B2@KQ9kD/Ev gQ9lKm]:%VTK@&Ty[v^UhCԍkU$ tϋQ΁`l*MUj)m=c"+< ʉJjDOk|3.A0MO6q@TNV1zV>q2F C{,i: n[B[Ӝ˗_Vx6j )#6 BH g/Tbj aˆx#} WUZ֌uޮe btmG *FȆly< 4-ac[ʄn*vSEۦ*-7*\&gzٍ_%ibY4ve,$=/ Ll9 )v%٤n=`[YXWeCBdڊL]ĢP7[VU[pk$G VuR 6}q!j!hb )550>0dd4 Pc$\²kc!FQn2&;s !Vj:0ROiƖ<Ӆshx/dk#& ( չH3Z;D$ R6&rZtH=^)ROo7M#KHD-ڱ2qƹL3gNAe8"gF%aF׭͞r)+&`rx5Z"3zbd P&=aE!h-ZAȒF 5ZI5kVT),(eb"wޥtRxDdP~kľ_퀉ܑ0SyNp S@ :1KdS%9Gti:BaF㻣vY缏[&Il8rz֣|N': DcO2vYb^6c#p^ĜJ ;NUs<.D) ({rpsXQ)NǦԷeB1%Kvs 6wK5w 5lZ6a?n?}x7KYAr %w{yý=ڧ5+8F\_5\/6_F67uJnxyX~{X~p<[Q>}1{NoDu[o-f.H4׷3y̿m?+9{R,uBNle/RKl 'p8B KR*řoN_ tҧ&Vɕ"F_Zcu$/2'V/|Pm*w?a^Q]^1-RkH.m i;aǏu/'8ZӔ9H370 )I;QֽbLJ G1U]yX6 MPFoAT4L=|2KΑA Hk9x枮bdF NvFumqlHJV(SpJ-slnF<{0V,*[]PHtN-wü|E)XyF+suR#v2QEI"xHFkR[\^ee̘hI:/&?.Rh'ݢg.\"Inߟ9}Wpwߪuԕ_ҧ|]}~>>"y-dE?~={["d~+_NƟzL;&jXMks>q[]Yo5i/׷4뛲LhMYi&?Gx7Sn:16b]Sd5wKArXn)6sǻEwKA餶Np1ͻW n9,7ѣm,"gKo[÷׻oW̓3X~_7MyBIM۽C`!-r/I}ٗ#bvD m ӈV(5/@!ym@b@.E:XtQ@eJ;^ YlЦ.[V T]hIM@0 Mi[YٚzneW2Ҩ 4 (Ɩef- і҅EQ)L TxFm{Uڳ% r!)d5Z"ch1 2Ȱw"#Und*xO)ꕙNo}ΩMyZF+wSΙX'p&.l] z`o5G(`謞>~ ^?=:z^SjUDCzHǨZ7&GX5bp2$d#\SL`P Hfh>h_Uiu`ъ(SBFqQ`SY5*[VhMmm)lamZ[* BȄ+H-N'jJ`9&wh̗;2ZiwCmDJ: ɏ+(4rv!Ѡ&39)A9{`dyKTPAV. 
NU㐚BGc'   _yinQrO RʚP:x/ ]14r1hzzk>ҫؿEf*Jyn1gumEMsdKpn[ ,~ctDW ˀHZ<]I?3aDҕ(ށD'\/cʧUVqxvȲjgl)M#;O_PC%&RH˅ )6𮐱VډՇ b jjuUZTZc[mi "QVCpFim6CuCQ-4!IH0FAUK=Q#QڔulE۴hm]S[۶E6 8m-+ IP @ Ŭ /V%birސPbp @Vy5I)PpP;!3k+YwKIHR^|D˖*s5=HQWoi/pzX hn!_X1;Sn`e' ؾgwVj=2{^A2F>@SV91iO0qtudƬsHOokaK y-PZL,DHԑ$}о$Lb(I{t<2JcpI4 Ej@cvN Abf)6iɰkL  %B05-*$@7([ +U 2et[g>E>ة- JLL]JGf2u[np/w$h@B<[B!JTل{grG-}%1qCe< D7)b[I<2'XTas_h_r NI^R+z^ j6Dԅ*ɭ)= h~o<#2 0c$=)t$-Z/@ԁ `M`G=eNy-hc`aB3s f}Z gߗNӰ>)H8`2  {2$*^vN^9;c\W1b2UU2ߕu<8/e:2w??ꕷ+oWVyCuwun*B5TҺNkBW ]iV-֊hyVH\\օ|ɻYjCMH+.W2{M4ux>kiYP|Fݻ a!XDZ)]oLhTn5QBUDl)%R ߾.S2E@8$"F)FPNZS()+7PMHKJ!P S?"Y JBC)+8 }*"(YwKkdk t[;$ EȀ܎Hc(*09sѶFN NA&PfMhȸ#QA)--U,IlޟhGN]cO&W4&mhLX?4?Fh<JX6'×S;jD֏[HH 5%(t햔Nux2,̛`Be(/Zh%L$.sjFU)~].w2}З};۔'+$&/EFh0 x`K E:bk68 zD܆;Ucc\2 'Ԯ[%|B6J*1BW 䪨x܉Qm-;)#>WOfXKy Bp_;K*b̊Orvҫr,+A-r8yTNi%klxׄS>U #Nsܓ +7+oW]+*fմҒkEMP;kRH[ܜ[)vSo %+R?ڮ=f,g׈!,̝DN@ RgHҲTMIS'5)MmLSZֲ)V(}N4X)@  i 5P>fn< 8@pR*m40  F2.mq>FX#궫>lO.Y?v~%U 2fT;H-F :TSvpJ &[DzMHCk12-b% 6?A1j L #4!J^fOcM%jxJʨ!=:"~%Uv5-sFêoܻ=!C.O\\77[*w}}M~L)IaT"b]SJ#_L 5 |S?c|AF_vePډ1SĈnz;ek,E-rogfarvB3G9 @?Fȹ҉4w/t[ϫb^I9:_nBA׿~$D=};S,Zf`x-q0~*Z(QPr7iدg?ye Ղ;//PJ.aӊ13lʨfBv܈f.I0,ePfJ.R"oom]WҰ.ڶ( VR-]WXu%lQ@UAʲF*˝VڢԨiVVJEP[tJT2 i(uUVEJѹz5d&#/\``%]"6t|N睜6N B-WIURfpo#!}@evn>VƳ94#GVd>BkۑGRb2[v^fЅUnw{>2R^TI&II~]sx \Ǥ Emsi=Nn4B7Kc9YkJ[h7-fQ R1X^jnUX&D]kYYƲEBq @ 8Vv(˶ww|+~^ SѠ mRzB)Z#G rݽ?o`ܧ,5>yG/${t: r*s}sFiL݅ۥxsƼ"^(9}B;`}vlh[PivUACOy[b7#EJ0S:d}T "3'FP <ZABg}3$wno8a8Hz&RCK Vy.NafQۆ] I|MV F5 flY."ZPBq!5‰ۈyѬl 1aؤD^15;-\ё"Nu *uu @#uy %'S`Avl>Χ޼Ԧ.l.̒ R Bʺa.RK8h Pj$eZ%QZXVmJeͥuQh)b C2 '!*wCu\"I|hpa~7@+pPْ&!zWסT:U2]+[ (:r k uPe]tykܑcXN$gHS讀92\>ܜZEO7 grn`N{ƃ<1jl5r5+֟Yh{4>{n4|(4@&%kGR-.##:RL7T^\yThs4Ũ%ndI2|檼a7]z~=Z>V_}\ơW¿o %oPnQEV=4a)X ]4&$VuPo!w}Pd~M{F6_r\u͛:/CiӟWX Կ':=eqGY'L/ X7~_8.˜ QtA047zC8ռQ6RS (T1-Q6IF7"7KuÚI}-KFf|(2QKUqIqߧBb Tc/E&]W[Jmfhy fsF jAlʬ)}zia9b4|0n8rBb)m?pulFAyغx6ye}g&kcS#؏?K̉5J !74~:yA<-y[lK޽٘H^&Qfr$YvG'j ib*aD  y~t QiY?>:[$wK"3e?L+.D1Ͼ<"/4ly2X)=dk S{CB]~K1YQR1w\Hg= %^sdCAj͸zh* D(tNwQvgtFBM 
>}\Ezoݾ${)NNF^eHjTř32|r'˜ىwBsE^Ob73FarpW|mhLK.4غg0 B>~,(4.RPB¢YF5j7a;JܝalDv ͡,^Ɣ<"HE1NɜaOjr$ ǔ\bțId\瞖GN41TBH\||7uvыk: ŅXCfk !z;sCNȥDN`0(IƖ`BA! nj&R.* :8U8bss!8 (˦֦!ŋUm@bS!i+RD*U w>FiCf/>T%޿JDx|7hAQz( jюjA`"iHyb"DS@4jbhjabaJkB*K䢶(`T$ow{c9po l}dԫg$`#sH>#1/7/ ik=#1wyw:L*& 'W (s\!u+y|p7.Bb$<$tq*I0_T/֎I ]O;B:^g b0R uJ,saFLSЖ̀YiC^{Q ^ z^^õ%p#3/_ô7OM03ۑC/ ȎUJ6y'".SjޘPTS5Pr]uoa^`%o&2O(͒K^PbJ7xo_gΛWѧǷOc`O?n}0.NI|. 4^To5(QԺRaSKf+Kk! %($uDoDȬ+Kmf;/Zo1\ùK\}Tbbn5[%zRvɥ ہѻu 'R!&UX1yKI*PDgnH$4#\20uQ7?eiD ]xqv WJvݸ`ognϯO(Mo׫]C^K'no^V8?oude/_&S=Z$X9UŮ E8Zms\]^sn[yzb[a7Sm+lys6<>]d`J`&]yzt`1Qs{pso=k:5ۖ;&p3+ %F2нםCbaA 1q}51ċ?L,FaOrv,}Tƕ#EXزA)Mﺼ U3Ϸl{QH!w /_WA"Z lW:ױݾB\ "_*cSݑi]8t.|k WVUUݯXr?՛[ӧ՛Ài3bT4}y}oq=T.JT>ů >Lb cM4vVzo=*7hZF)dG[,>M'!tWi/f=ŋ{n)ZF)fqn>OӉF6*Bnxn)Z)ϋR\ vc7aۜEz>=ܯ>tn޴gGQW)VnOn_ot^ȥ7j'vw NmK-N,GX TvZlgֻxCJH>s+J+bp 8phj5ZJsij燹qQvvNl<5%+tobƌDmA*0nHuxd%"vrN$"anh6UrP7$mؙE(i CGq-,` ~1q7- FǭTWt4>YD K۸)v,%RD)Jkhc'9[mB@e\vTS DG hk CEFb\Eƣ Fkl#F\47 cNi(vǃICc@R&e<2JCA n vHl\>A*v e0Rb.sgSN{IjcOʹG?w&޻HoZ3\kkQ4y']~vٕgW~ziQ Ჩ#)ceـ55px]F6.P `WS5Rs<ٵӂKˮȝ!W4Zd!ș7|=f;u=-v- [F'C~dn@)5n )֍R%uYs['LK*1HAm+^p8388 U\OWn\9_W#rQW eZӓi ~{ϧd8Q`͊||YQ|xŷf?oX>g"E"ߐk?Clb懧;tJX{ro_LnU=S* }dJxa%?>w~HrƎID"@rz)VL0O+drGI'75ZfAǵMsSX)^u*C-?$z&x~ !hSمF3ǃ9ʓ:>IK:^ɐB^lY1`q8^|òDOrSs@& sʝ#s9"h|11*@W3Y䞯G흮Gz=ܕ=rO]BGP']''"ktY+΂|:9( v/F9Ȫv&w]J%('w{Dw<ێzNrsf'{ڽIO(uJpp%'[bUo޷$z6 V9掳{']J4#v#t.>L٢ˉNvO$h]trVYuӝNֺOaWE3*(EEe6kTLnė\pߣE~0_ðaB='&j`r9Pt0NZc[p*Әhn%'  `K8gDDxx',`HDna11 }NvFs':0Ea\苽d]}xU;\%,/շΗW&wKw%g[&nHn^oPmI_Jo9q܀WUgCt`֏dt1Ɇ.Ģ?qw r}hPCX/fWvy!ީ|p\Ys>kL/wiFY"lk`JZqTr[b{:K!N/ԫTaN(x]kNMf9ŤyY) ƴ(J0PFTWRl+`ܧJdi:9 Ywnzw(i<;1 }UxqLTeN݋MsdrN8}"g:Xc4eaPq$3Ik> -bR.8=Hfe`:ɒ 38qqkG~p !v9k GSL9lrYk&g B]r+m!B c Dh-%j(ekQՀ6T PB1)!`Jjxw7 P.o  +x/fB Ŭ"$ +_o>0+IO䑩>`NV12;_ 4kTc:A˞u MMbD5& CqN3Rʪ*ZRRj1!9AM75 1!YqU PAZo4wcWg>o7˟ng^[`F=QSv|t[$2tߕiZ|F=J hL)@rS_ ts] :0NyGCn[Pfe 9-&kA)7;YqA(5֮.Q{=|^t|]_otr>wW# _F]ϩZzh}I_bK7z#zc*˜2x +!Ӈ'wkDrk@Ńn՞b=%AO+j4p "ػ7+%ьryCyS7~Uh^QqJ 
WȪ,)RqE2n8^CH;ǃ[7q&ۂn_^a0kǮD+#[9m/(LHv9<먮6I2q*,45[ǿ7͐jKXj?3ӱn\ wK% N4F9-GZ&WzOZ4qJKWPsH2ng 9l՞WnHNxwZ>[!$GwZV#S/FDJ13h'{(54Lb]@N!ᓎ><)Aќxz(-OaWER39xX0%ʢJvKLQ9kp\)&Yr~uO'jz:=iM!)yT58htwο=DQAB) uɴ,k0mI) ʈ@!z-hhB#-w8Le@!<#Ǽ-QA.C|#óCsHUAW.hZ*&T82@s$k+p:}K5.(ȍaMQՍ*"2D !<>uܱQ{/6 τ upt(ʙ wx!L[GFmܺgۛ*LDEԎsItyQDiMsS0Q8 `YLjn8$Nc4-HN`pfƑ8rv&ʼnU@޶[%᧌q?p"t1,H ֘=3xJ 9I<3+gW,f!RkZVT%3\ג麪\hτRѺfDI ,m!kvY +ּcOR"D$7J6Ս$R;~6 mFav ! *A=C@S^ K@qFtvl( PboEH1A8XhMx-D;oX@[k#ZjnWv>f뻹yT9ԟmzqKs{Wp}S;;J;-lu3/\_~E}AOi4fn]w)K%Alʛ]^Nf2'?wO}Z |9E*r_~,~e5 wb_< G:Z)tEW A¢0v#5Dhݝo^UG3{LHA&ԓ 4xGo@vX >3TpZ+CDϼ = Fnbֳ]B0H#4XXWì )T|:9D}Y+.k{9|]:Bj e*iC<8dӬO[ETW2=k$g{_eRK4]I5p̆q+@1-ΘӛRLFHSlD]yoECQHI9?԰yBZ1d:nH%I1<:.-ucw]pzzfNżտn7#v 0ii! Gpۻ:N3vVԡž7@Z4C倆1ghŻW ! ڳ!ō<k_l|@sVpj!ʠTq.W01B4W\Zk\lx,Ol_<ٰ`D%S]AO枳жJy:ȇȊ`Gg$%TSp4=^iq&Gp?ǡoƑ/=1LBE#^nu7*#](8$~>K&l$q;>Vu™.+BL)%T6iguieLĨ)Nco$ r 7ǭy>: _CUXNaxE0"4e(4qc$hUMJS²kqTY0jWROl0#[ǰE+'Sہ/%v~sķ}2غѯ 3 x$^ssB7Cq=`xv󔝠O\X?0JwFfAnd+ KYS,~U^q,%E"$u5 Q L:S {W6ʟnnG2^@#Ur}J**Xsr[ e/L81E5i4@oWl l3 5^X˩5`UjJ֔ Ԁ-QVB YXlQ|o:CdEoL_r=U⇬XPOJ^ި:%J† ` 4:&%.c%5G'pe416IKڌɮՈ]vS\6ϣo, !Ś?OsJm&y3}}t7^b*'X|7\}-K"䋰w+ Bp@g9n2J u8+oH}{YA0l0J9 l'Q<_ h /Wn5 !^6S4C*ՎhTƇAib90( ׿!5Zb8xL63j]lVnc9,I(9,4gnCs -p! 
|G҄0Bb$^uEbk>2XYjB;a1qAe-DHݺF4#I͘` 4Yr"fq=W_{wo^G^O4Z: \l^_~ZILvnt~i (Mesն^!~;wW}\R>o ^mV V~}T7vm^%aۄڤtwS9P!iG6?)rI_eD=7)|k Aٽ ՞;o])yOwVYYӝ[ *Q.]9.9$bD7p|kǫBPkNkN)G<lU7@ZUF,眧jGvOY]U _;IUkL#k%Qɨ)5TB9BY2MXP-htw [rfw,rlB(ɑy spuuHWrkΛDH0Q$nB Y1ITL(tpCt1yB`6 uV:={In4]c8O@A뗟&M.l9N֐r{ WjbTS^݊Mj /%!|v>3ie֭<6aN&Wue6}&֔MKQZ{밡S v3jWlBf,ʐAHJ,hV$5TUvv\e_!{M@k H7 ~y-j]6D|41#iUϿ|X1:s;~@mccܑ~Qc1xw8V2e-w448ajqB4u&pvFXn 0c5cCɆgĎAla-< h!hP΍W1yKrU^\{v'g |;9[9eSfѧ4[wy~l+0Y=ny9f*<Hbެϊb1i:R)(NhUi ՆfX`LQ"(W%Z"(h!$oN ZzEKSMֆC%%dTX:&KF(cE=?T„IQ̮P>+#Ta ؿ W7l"8HNo/ϿbP$b"@%e!L  D?(.+p>>l>Fx͟л @B˫<.sEz}|g@)y>P=޿+ làD } }H>Ŀsݝ/E)}"&ě:!q6-WJjMtɔϓ%X#pV:Ϡ$M6\ṉA3|k=tUUtYܬ?u ^TfnfA1"hi9_;ut~ޱ&0zMz7{agE%yMe񾦗XI gT|(w)Q\q܊0ᖹ+SZ{Lwߋ[g;5FQDtG|M\<9h^k)|ct=qAk }tQ󎳹:N2^ 2B*tj?Vv4k:)vjX+)v{R~wppLbpjFww@h`MOk'h@37&{[ۚvD& NI%c=:c$)f~.KC|XmqE2\r<>uԥj谔Q9X* j vrM[]dֆ\U'(&U߻̊p]]z%1RC|j`ƥv,0tXH|܁iBb;>-y-^|bֽ{8#LQH|k!BhzO+r>zLmpwI!G fC)~\˻"bQbt;9m 5ʖbކ!LQyp @H!`I&fMBQAGuː)XHZe؇5 \tX_ĕԬefF,AELMG!_WXiMҊx3s4.H rb$ k`Mm B46S)K y<#'doF-p"g֡ -qI1*Y,V'3M1Y ` EJt;djwBRT" R0zcI! 
S*_#͑k .0r*>&$) Y@IYx> G&B kȵk# ޑ(R܄zK!<ٯ۶ŧt /& ͙ýr?^r6w?5ǹIrd_yX:soFE=o!1Gsk~SCɊIQ&S9l> *ßǺzV@yJR!h^o RXe:dߪ1Z*)9*vW~Jf2W]@ĂX6JJF"`-GQb:~S)I!jGP$)Rg,ib900',,kJBڈ]TMsq4p@?Mgpa#LۉZܢ2y*%oȌ~_:_3݌%6>s qF&>װd{X17ϘGVϣOWt(_qU췫|[?9.x,9{=̉e _~nFJSȌs ݱ,ǯ/Wu o¤6 NoM|ujYwQy[W;mF.,MZ2̅Sp`aݨ oaOqdmW/ &PAz 3*كxyrH+:̌ p啡N$K%w&$Qh2PX'rtĺWT\7ye]ZjtJP* !*TZG3Py5ZFYJ9qP4(nHLS Pe"5Z CWqI"]$ @zcTڛ"u'hDx<>S=N tCM 7#ü̶~ѳ#WS$3*.,MMS>pqm"g]ByB2QBJϪ(vw $ _q1,V,<+VQ?R740Wyk)zqgRRWC}ji*T#Wﮗfp[r'&8y~H-LBK3jFbjKqAQRnXg )M@D` P޸.D?Azzىc :3.p(XO|^)(`[ZN 6Jl]}- & :TSIvیwI?BhYګKw5ԬQo wxU?JLTeĪ{q^/_}_zZWD qF׺h5M"Q-!DZ@G"'Qgok1ދs-@S3~2%nJ VYA*CmtaiM R JZNÒ۩tQ;nJ#Qٹ(+ Z(+KӥV\6QZ/~~-k YY;idQW6զg-Ͼa1t(l*@jrsu,mꫯpQ+7ߑ &RnK4.Z'⬔4f^X)qVKJ/Jy/+(NJy/ͥl૸p+R(XUT?|)@R`hBYPJ`CRnKM9J/JyGbr =v[jdؗ^F+E.{aQlR l+4J9 +&VzHm2XE[iT?&\JgCV*d K +=$bؗ^FK%i!3妝nB 9s3Ѧ]bM0X34k]tk.ZSyP@{% ԿzWƉ[?gij?-}ּ^T:kRzROؘa/loQއ{-zEFKZhƥ8UAF\t*Ȉzx_AYf:RQZI |)Mep90R(] [-F5l3}jg//OtWw<Vp@U px]u߭*vhvpԊK6ǔG0 LaH8X#ljD Ǎ+|YI19 .KFg?\+p]~nL%ꇤq'UȪbHG}eA\6 ΆﭗG}>;Ol*azp\'GVU4b2l<ߏ2F cs]<[ݏ&VDCGrgPIlxTٜUBD+?‰ol$#,3o lVV,BgGϴ_*j*8y_ɂcH~~of~0FfNHF"Mp8u<|C*JyC}\ߺĬF˾|D 1fbOdz iBL\,lwxbZ7JZYe&jH@4՜Kލ{;x/)jm߷3eGD0Γ|x6{ 7i#egGH4 e.ћ=pCTiGsX($e׌9"*&Ƒr #HN(6exIO2BIn e)(#T3D,ׁИ82xm6)Z}I vWM=֧J܅ݻ@s &;rwI@$t>ܥ8K3"H?olz|{KrhnoR )9V?7&RR\Ff~/-t48B,h9 HxPd7F'}9ji1|\2L1Xo,IP`ۏ Ƴ嗾`zzfhrϽ|P08,RֲrR\k/oƕ(e *L@3V{ҕDF7U-khl[B|;·q^A; TEpE1!sΝ(Mw9>}'1#aM?J3( NȈ1 `:U PP!8pm# X'Jd -.qn_G%ǂK":W.h-k/Tփ#`.EQ: EaR4@TN(1.kX_z{J|U:Nuٽרj?`1/j2Q{ tWpIAN3+3UXs/[u Qb5QHbqLvZj ㉑$ܘDD(EECܔ.ZKIZ.cT\(9-$3&m+sSfQXפ&nQibeVVjZeV]xOɀ ~< sT#8XR -I4JHȘbN^I#KD% P9-Ca!,"E = &q*m4ZZa9 |MlYHDY=JTXlaIKYIH2bqXSLe?$ ] U!,]d+ f.})T >U1]< 0-TRI!ZW B5z QH -};Y| HǵtP#XEyXy<\L^|=/2Dj5wbʪTI2˷f?[\竖*..`"'oФIo8x:u)>nvKt.Bť;{ TLt%,/"euX8 z&b u%&*[|ʩD8m,AQ΀NfJLG -M/CEn{WrI)Dӑ)\UG:Biih#9! 
`G߇oD"+[#)롘 E <">kvj/gdC{ܟz7Wk\-nVOΌԑVj #9f Lbe'鸷>0z:y~s˺Qo֭*BDe)JnͺUyhꐐ\D)E݊JKm%)63Erڋ)%X.[Y L'y;r[լ7>wC_3ts泙 7x :y]F8PB-N{ 7|3%gUk-|s*lԦNY.~ysL#Ӛ*w?>jxs!wE5Q,A* hAN.pMu{),c$+*2䞻m?<cn'F%u֭zg@V|"zL=λ-> ߭*r;+Q*tQ@E߭e15HW.˔:IU;*OXNX}lBcDJ+rr‰H1;Dlm-gztl'f"׈Lw;:ns/S}W\Zm͑k,äKJBJ_H5W˖RäboK)eg!IiJ5FwRzRx2 I/xXk0F\wi%fVV?_?e,Ç8݄=v@@\P }RP6,R !')$fQ%j ,Ai};`Ԫ):AUS:MczRz>ϣB9ہ3(h5ʳ9kViEYiYEe]}C>1q-H^wk䋏ePsv|h@c*"7~$I8ͮʕl<;-n[` e]$)=c8FutZMP{{\2DBi!j'DߗS0¦ӕc)<: &Њ40]gg;^1Y7;;Tf&p,98px@:HW(橵~z7C1䈘EEΑ5*"Ym3LRAR;$Ib̷I @hy=ݛo"ͨY,- V3Mh$NHX9iU"I-fM%Y /Q6TfQNHKX@#~e \ĭDD2X@D,boY+O|6aAm]D OLlmQB\%`ʘ4C0uߩ""8"˕!19BDQ6 %ݏ D{[vwQzUaݳG?/{:W)?bo޵5mc鿢l6!k;)x%[]n%EL_TKZP HIMbw}88Wtq?D >zrmyΓ{,D%ܽ=/ &4LUFQi,!ͶL7Gd8oe=ogPszJ~N#10 /a?ݣeJwf‡-m ׃9 H}96nn g*d_k)*A*IjE=4060BI k?ObLZ%)(زx$>_3>^Ryi>]Aii|`2Bn" $vV/Bʔ"﵀ԝ*ה>PR îjw"+6]ۨ~Is as@됐o\DdίϵnXc/é`i7hڭ EE!?n-j7W lDǵ*ڭ=pJIcֻvCBq=)[ؤ傈!عCzUf!`R)m_&IIB(B(0esmC8fQqL#* C Lxa$HH(*#PL1z=L]\\>>D]ւfĽ*QDZ' Ui2‰MEE4i%h|_d iH^IA͖w5imAf>6`jX':8F9hf߮ 'QlsX=zS:xzpbX=lpRtrն9I|w\@U7ǘ.qlӁ}旍341;So'~ggW!^ va؅ͳ+1]bv$oSiQB;7_Ǟ^hgIäJAkHdqDn'(PrV}9z F@2vCn5>}5 v.gc$^rՕ3rm(gB#/` n}Ag$㔻ji1Ua:W.urzE ڳ/5z+8{&GzEhMdQڽs&@0ɚ\"u)ӷvjƝWjDnqs(|brǧfѡBr+MMP&q1 L$bQBdX B8bLd㘂8 V˙ (؅3_VAtH`S_k7o$;w@9Nfwa!ŻSx~SQ!@M! '_EK󓽢1^?L ߁HK\M 0UӢ:fHĬVOiqA-Z+8R5 )9%Fmϔ.-̓C8B uRR13I1ō'S7%KzD2#ˑKH0课|=8t3|mbm*~׹oH7>~xfFJ$ K)~MD&sSF6N? 
s2>hsl1 g+9Y)Gc5̵PʌƩ>Pd(\*1ŔJ([F߿lwA)+%G2G 1Ǘ~Ǭ='(bB덀<" 82S,5ҥD.-*a5*mVhݐ>b|;7&96|VD^A1l+0Y'ڸa Y*aKmOLHsV'iEߑL7g}Ar;L@~),(;GWP(0Da'E0oWqzgWmRg6 YػRG^¾Zc%YFx9sˋI}P~@+CFo 7DS(׭5 Qf\o E.]k :FLu+ꭝ'L0(wb8$.\1}#15~| P8:vq,&^D4dR DU($)p$b@&)D,I8M">P"|;PrS:8AB$<1f4i"TZJA*`Ps>A4%DY`PYl/o:Z'YQ^33l2<`_dޛcK.p!Ųxk2' Rv ~8ٲzVk-9 I=ސڊOU9zGnw$f[_;^U_Ox9z%$Jqhr$4łH3aBx/e bcRf@4A0R!C X:w-f8T1=x21&ц&݅2g|?L+٣?-.fK^\B>pb3sK᝹Wjl.~5w,x @ͼ⎃8{=OԽ \t.6wKٺXBP&8ֈF8 )&86I)#D+)i8Z ]|m|?.< *T@*ՃRH *0A$4,M^Kq6&q[~&}0pxZyi:Ӻyz ǹJ3>w9"o7OO | c^tKӅ~ /]}j}\ez״c@\z{0^GA l8짍>=>Ҕ|940(zߺl?eusrE3}OkY|t+&9e@UhؓBa)"q@ -^E5~;mez4翴.Mկ(t@`$wƶd#g~(,a#6lA{\@[[w? |+#*t3:zMAdRZa^KTDI)#A`Ya)z"K˕R ^`!s#ܽ~K>ΧqD[.6I,FIZHR0Iޅ#@J]G00L 5݇٥H)\cBBň0ʥ]\.ڛ5pBQ WBM$ŒP2"L T4(z tRM+J\,OYfGY!̊m|L6ജOm;FK_aד$U앢7 HHkz8Pԝ|); A?u IjT]搵d&ߎsI2 Z z0sA @"FqB&P0"(8JKF"֬j$L0&,;rlb9@ _l~SZk|%hGT/JP`%J(BB)cBL(e0LyRY$bΈR5M/[_/[ qMWpsX8z,]֫ \nn׏ D}KJ) 0'|kF pBZ=/KTP..d;> `ݰz2D w3)y'Y ɭ; 0Vπo2Eqy>+tMZ[eIdk{wNYnj=ju 5]h?/XJE,P(8PИ΂wwP'%m}ɳ_EB9[RtNM:BR3ߴ-#$s`zĉŢbƀ1$FhY hȤ@PH $ "‘<LR%"N桄+~ 66!1>5RyV`+e iZ9IH8vhԼ3ΖYZ$efT5^u8zaiݽ4?+1_cǁ HPjj0zalHͬ2{#ˁ劙0Qi}MwCcFu|C|a6oJ.՞wRAp=5_ɱWp RYG%w9]~aN]{uyj6˂O\לSC*.4[+ߋ,=;/&Ֆ(P* @ښIT  NdUlY}h5A*uؕ Zz1v⥐zfgޑH[RF zoehQV,*KP |]'^[_WYv% ])m:y5{?B[w~q)H]#M> C Wf)@~_Gl榨d](^8$M^/$6.00~ѱRKB3eՋ9A~%xe:/ܿUs^vbJ <%Bqt⓽6YNTmX=|2^RlGoդ sOaSNW<8ǻQWj^ԔBq:Sӷ|-*p&$%\gRyRL?mw9}cˎS$+/=I,S*. 
T8cRra0=,(~ 3Ii AqJMx7?sy|Z&M/E=1lRh}+-x5ٛ}v杲6KŒߦaU133j(qTǩQ3CjT #tMMkl@7uel["䲉MɫE\q$x f06w˯.҈T" 2Ii4 I5LE:I|)L 9@FqDE0O *}PWVbӨa6z wpr򒴿%LkİчnK+JUښ(1=ҭ8y"Z.>`9DEBcv4k~6''X'l mpA'83f3g4тijAD$(Y23,{*S;A9j`%\4KʈCL4Z؄RDYN52g4D`IJe'j?_ofmd&_@yxg`,9<J*1&J#2zOy zq ޷gDe[0EP/`O2wwclڄWw,ݻV+K?Lg/W`!"bx$q0ضQ$>HT}ڇAٻmPcQ4Ţ)<)kf8yp`ώKfI;edT 20;{Om_Ͼ89[g!#-޽9y8y1ymcy 9ܚLD$F~@S0eM,AZ:0Eȁ&࿠o 5Pޮ7MM.y(aVÛt(ex*p,'RpEB$|‘"XQ /fn"b:Y'56?{/OO/>^99xNWhx`?EE3PLRMUemnRT)5nKaO DchLii9ی)JI)1@C&O!STRr<2Ĕ Ĵ|}1"NUZԴsl" lhb>Zz'8(Kyjz\^3>|;~6~+ZE;O}ɾ\'b:΢7by^ /"h@.^PE)eiFsE~n}ߛ3aUT8v91~ lDl&[e=~˷!OeK^l:ح xz0=,YAiBO8\`1p8IediIuWY-RMI)$s惴r?ܐZ4b]ˁǰV)Zʮw7}nc3E5}J>zb:fO-BG:zkaÌ,ӭⷫ 9ǨcoC}Φxu7 hEoc[13Vc"V͠CQ &[7XcUQvӑ]iY ;O"5~;̄ uV!91Z!J]+Q}}8Dl%ùSؗ蕘:FW6Nί X rC#YyyŽQ98z_O"]-`:]8>n>oe|_z;XŎh:VU>q\|Vys_o6ѭ,zk?Fߩ_h7Vt9vAľv;iQGLj:$.Q2źoѿnuB DtbQEKL%iMgj:$.2yPq(9{%앐vdP6y!LH'^zjI) MGbFoJq),{N5$o[Jrh&A5)c9̅DT l$:S7* %\̘1"v x jjN¨d(Q! NEyZy"lHe%Uнkt 3bI)+܁N6a5㫗9ǥ,¦#P޿,gIf[tdk]x:lNôHs|匨RdaoaAnsxáa5WCTʶײDv[y/Eu{6߸K`]}wVu Wu nCzu8I*Hppnw1EHJ"NG.Zye!tcu:Bf&0"*)K|̍[ގ̀<_n v08JJ5 OJ(U{hf]%USKCGLΛy7(D W6)kO_nΡ9p]EwO*VO,vP*8*gwzc 5FA+RF;D.brQ)5[KY*>B|7V_yri~Sjp0ZFv]̝gjV$4/9<+AA"n}QX.m/f:>~686,@)j:(էV$ 8Sa$Vf(4;ƪ:| /!ѥ5@E{c>0儓09V9e\(g[pYϺ;DkMGն"X7ˣƥsg;/4qQQΥ?7qOs߀8A ^Y߂tg',hE, bט`"h!D4O^Iqn'$LJ R*ꄔFhjZOR0)޹n5K)aRS 8I雖R&¤y膔2&tۖR¤!Âb vb} v@J:\t +yQ=ڤcDN: ^/w "XYjKRl5iTDT0Ż]C.np"q"[oR 2۰ĺHrGp[|\5O7%xU$ykVR:ocbTR7y]oxeVvh=yb6!|]5x&S^5 RJPbv%)K|5atYٹ<@ Y9w+W(=):ggKa5h- +߀TvA]ř՜L_Lc> 6ν߯l@.rQ@YVnV.vp)bj@WҔjufatŰW9ĩ$SL8lRvX),M5䂂X(V\Lu< :J"4y Q&w) =L!90ffh <Mfc;da5RN92`Kft۾"H A@$&;J2QVhY /]" fPd`fLjL$z;~hS ʄ :!Ja*$#,#&BQcd&J$3FcF9)E撶)lIB$'ij1 YfM >X&սQk (ݵDwLAYԘo(<3"؎rݫ*AtMQ; R!A`Y*O;H~a&R` Au/[ٮb;LMg:QF0Ɗ`mMNրK N`Z lU6>K"KLIEe f4 X7ySڂ-s-'#2j&-A,'-L۔(1~ivN.; Q\!!LdZXC, ~xGYN(nP3ST.o) {O/"NCAd[Q Wu%/=:tU -t .&vDHWPP Nq|lj"x=ުUkQA 6Ƹ "%C:7x}HUP\gsKb>/=}G14,%R 02MC 5HPiA9 "VK3u$؁1LXP*Q󁌔T y6f\~xcRC|Z4j_s|S\бSxx ;EoE)>l8~(}H|o^媷4U@>i',"Msy_ &,[>?\)w[7+A9-į 2Ϭp"x@qF?~b]]5Oh:a>o̓-.F= v; 
Ꞻbɾ=uP|w?J5zn}Q].]~ԐNǷ`vmf/$cl"D28i$p]+`.e(g[0[A'- bV 50MQD"]6\0?R;:N8el3ު(Txp|vU|Qv];P;s˫g7\'aD M~9M˛6Z8A}y\I[Hp&KISyƝAS 7 qC)FV>RYY,Yԋ=mW i-[IQDpTVyՂDU^l@뭂?CGwJdK"NTL^2jLqoy7["71W `rCCEb`DIx-5F/\()%cyST|OOv Ǯ Q~򋇛aNNr@_}\ ␉һ:Ӣ-h)l_/_]\;i-zBh2a_ tiIHBn- ۬zzUF;GVj^SH^>5]G_\vHaSTԺ@jO*9/dE>6x듓FO|OuuP{T\3 c>\73l:^tt6 iYȔV4Ro@(ʝ;ט2'I*AX 3[&Rk9J3 9{UeH`"(8iSsNVfF eSc -5|H#~4psq9u uY rΘч.kvf.\ dPvi]TpM[=u>OdœY:|شrO+вkoA0J+e.-rl8Oc2W-C61R ]*dSPdfx(lwzbj@HzE&_?Ňxƀ!خ){J:jWD]\5)+NyVC%mT@ bg.t׬ W w'ZDYetVguu`gu_#ֆRAt{Kt$] l[6jՐO(c2<2JJC-(s 4;6H/E-RDQczþi6y9HQ+L(AkL <3cghhN aܵ2dNv:TGhM?c,UFbֶ6llfK_nF>zX*ܫOBY.jVƬIgVl(5%Q>)"W]MLf:F$v 9Uijy1cR en~+JaVA)ݔ-9 {31<>#Qj">p+܁'Rx懯D<.at{uzY.s? s(E/Q5J@5Jd@ZY[N1xnvJTuTa)ҖRV q>uw8 DŽ+(-[h)@~ڹAi{wyp!-ԟ9woxƠ!8s޲?"Vc۩ nX3B -['u/`6 |]b"X'j:}uz-a`;~Z_dKtdu:G̱jD]!YDjC-4Jnz+vW4B tzφ>0+Q)½2!}*`^㡝yCxu}oDwоJ#;RdTGB(CJs/̋5ܤ,\^Hɘ+˶0ЂRV[ƝH-KQ4 V5dql㳪YS=݆峹[s1pnWs(%O\i$u hJsLI2E:`i.ѱGAR/ &9B^2ڄjRe$V) ϜAM ;0O~H8^@@={2`2if Cn%wJ2'i)=H&Kw;uPI6mfD#Èm|ZhmV.?Ki LAi5<8Y-vNfj~ܠ` lȬ J9͸':W'RckDX&QZvVg%-#ȉTC8?K|iQ6*q$ QZ٢`ig=Thiy=lh{&@oЄ, mU M2˜鱴ƍ-XfZk&]"|\g )kI17VECO&(-.֎kQȨQ_Gm[j;[UK'8Invm6e[W+{Ђjw8u`aޡ(sxv\Y*Qc 5,f_Ϫ_ Vq(:wrj&ʁ|F*!~4+e.^ hPYrEQBa·Blt?];u38~vu2]?z¿Ә;LnZIh=z;rz;/Z~]/s6^ SbpYwl +1>#`|+jx`V@ Fhvݚoo3+S) { =a*)FPb|-ȃg׆YkZ|q' e=i-8yan0c~҂~gŦavܥ\. g$0UsHGJEV.+o/ y8o7t2 _o==vvqF&_l_˵93#΋忹q_䝱sF駇;>M6z# hizt{gkZ($ezCM78n)McrWh0l},簭1ޥ"RP1FUQ_ &w-"Whak6ˆ@ z5^ڿ>/:gbq(A: YV2 {.Cae( =žSRtBΫ{'SD@ODΔR349"ӣ|1,(FMt>ZI&_݄kޗr5~CW7U]0Kk?gO"utzt.t8]gI[D&bHIi#pʒ5 L q Z)Μ\btjfө| c-O\y\!S.Kp{Q=-"2OHdR!ƺVcvX5U TFXPsBFx /u\J]. NIBh:x 1)!h`핽Fb,Q,1`9 C]P*RA)MWk)A`=lNjY[YgV<*MOɌLHsFaIh0LJd ,839 pNgdӄ3  ]Jt ;G-yF15bR|1)@UHVz1)QqM-S60 MZuCΧ RZɲuC-bf 8gXNṣ}ȨVZ~6D!HۧUw2v-^[c31qr5[n?JE51@;M8:/ghqdyrV0j)@bNVu? ]}ᨫS1{ʻuk qi #z<ԑ6h [;Ү+7ے5 &hB yxӽڡxaiROZ8n3y`22ȑyaI娤Q xXze ?:՟9_/N+ tޫʂi?GO׳tůgy_ ׳[^/1 ;H|UR~ ob4mzB[?yY |9~qժ~'f8K1v,S&jyL":}{;)D<#7 &\p#NeMi\fAQ=B"a_<,Wa?XKW|:l" h-H(YO?}M$8tç\[w]22lMa]ʓhNJ! 
&zc8(K=8 uٸu/#8)мz%9q}v喛Ek1tkn^Ԓha_j,{QXɱGE2a4Ŝ[Ɵvykf@J"?'emd8~ucnh'WGG(d=rH+_<J=n(lX=MXQ׊]$ӛ 'TSe [#+jts,DjC+):⍶EK}zZG"b {(hGV!9)?9QkQNKUJ(6| Bݏ͆bֲ!?{q /'/p$:Jrc!JfwfI$"YN]\,r{~ p_OwOOO7$ E: pyJr™z`8uOK.ibML)g,!M!ıu$,qByTs3u "AB*im ќ%p$b~̭{_Ь#s8 fӉ+ d{EŘ$!JF#\ZXsHXJXX$YBX<ƚp0W,r}9_ln$Œ DiP+E*L \[kS tʭ tJ4+P${TXYľIGYл#sW7+cc g +KE#Pձ#qr3?O tGscfsJJ[kӛ,P6^n?S'+hqXL$L\Oj1 >#"RX5V.m$"F'٫Cw>sN].p/X1~{(39Â',x2LџW 䟮b}X5?fnq['$3$.uc1Sq0^pOg><_~s^:`aJIF 汭^>:ƀx8}pP;n׀w:_鸅>8zMCmEӆXw3:W!#!ySpI_@ꦮ 1avC@el)b a;D?5j8PG!6Ȋ F2̚^CĢ{l!<=n͖ALY"$ŖpjK"qgb0U,QKH(uP1DH*v>;S/D?gBffn}kC +'o+/hW~P?n q%_A]\{uqŵWyu}CǘK`aZ9]xX?kc1668pM~>ߧ\/GYkd:cMߪ]O+|A]0P@k9Zߺܠi+ߩx \z "2Ķ(ʪ=P"b5rlte qt)"ӵ2 /}2H K4jdJUYktZ*BF0AZ4`‘,t߶IJgʮYo^pE+*7ۧGvmg9/{0k2BhEb_(h1J_f?n0ADK 4o.@ -W-V7ww`4{ ?iDJB.FwffٞxP%R4Y)E9b4f6w{`Ղ!q>ڈPE ~{P*lTtrrA427sn _)eqBLB&9XV[q{!daRG(G(ܱT_Ps~5}^R)537l˻a̟G|^Sg:sM?dw3ݧWm.M| _<|S[fiV^+/,wI7_U${/-]`;>tU˘뼑{@#yDZU("VA.{5-C-TM֜~ =(-I1eű ҘX$$r$B"maG?"k5Ui"0ΓX)iQ8m%U+Ȗ&J<r us{]~H#l9Tk۰sT8t-Ri 9|!l|E8߉Io%>C +PMm!?iTcԐ(0)>FDns\p$횶ۮs ZSZO`{`jU0A)+U&oWDq` cZS$Et| \ݖL ,owOF!4WO7ag;֡nDc^r:_zUم>kr6}$FWhQ@K'\@@uNJhZe:ilnScaM7[CE&0{ 2u[:?ݺ=NSk{Ǖz׶dskF4{u }9%gໟIJ\( hU643:Ū3]S3Hjc̡\q?6Ua{.J )NnLDЗj^r,fV>D&_ QU5y'S_ʱ@ȗ^MRKg(\' Q2qi-q4EE"o`ZSH&@ R/pOu;%.AŠAK~ww?vO.L||5 ?N,ǾfvQ+䡲;j4bQ]T D}ܳ&q $8~y%HE ֗u`h5DsETKH%HQc4}[&YG]! 
Feb 20 06:46:24 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 20 06:46:24 crc restorecon[4813]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c4,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 
crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 
06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 
crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.556730 5094 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569005 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569061 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569070 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569078 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569086 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569095 5094 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569103 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569116 5094 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569128 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569136 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569147 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569160 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569169 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569177 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569186 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569195 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569205 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569214 5094 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569223 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569231 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569240 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569248 5094 feature_gate.go:330] unrecognized feature gate: Example Feb 
20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569257 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569265 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569274 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569281 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569291 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569299 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569306 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569314 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569322 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569338 5094 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569345 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569356 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569366 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569374 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569382 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569391 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569402 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569411 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569420 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569428 5094 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569436 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569443 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569451 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569458 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569466 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569474 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 
06:46:25.569481 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569489 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569497 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569505 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569512 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569520 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569529 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569537 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569545 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569553 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569562 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569570 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569579 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569587 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569595 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569602 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569612 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569620 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569628 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569637 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569645 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569652 5094 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569660 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569895 5094 flags.go:64] FLAG: --address="0.0.0.0" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569917 5094 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569936 5094 flags.go:64] FLAG: --anonymous-auth="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569949 5094 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569962 5094 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569972 5094 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569985 5094 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569997 5094 
flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570007 5094 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570016 5094 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570027 5094 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570036 5094 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570046 5094 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570055 5094 flags.go:64] FLAG: --cgroup-root="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570066 5094 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570075 5094 flags.go:64] FLAG: --client-ca-file="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570084 5094 flags.go:64] FLAG: --cloud-config="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570093 5094 flags.go:64] FLAG: --cloud-provider="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570102 5094 flags.go:64] FLAG: --cluster-dns="[]" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570113 5094 flags.go:64] FLAG: --cluster-domain="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570125 5094 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570135 5094 flags.go:64] FLAG: --config-dir="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570144 5094 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570154 5094 flags.go:64] FLAG: --container-log-max-files="5" Feb 20 06:46:25 crc kubenswrapper[5094]: 
I0220 06:46:25.570165 5094 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570175 5094 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570184 5094 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570194 5094 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570204 5094 flags.go:64] FLAG: --contention-profiling="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570214 5094 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570223 5094 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570233 5094 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570242 5094 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570256 5094 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570266 5094 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570276 5094 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570285 5094 flags.go:64] FLAG: --enable-load-reader="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570296 5094 flags.go:64] FLAG: --enable-server="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570305 5094 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570319 5094 flags.go:64] FLAG: --event-burst="100" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570329 5094 flags.go:64] FLAG: --event-qps="50" Feb 20 
06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570338 5094 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570348 5094 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570357 5094 flags.go:64] FLAG: --eviction-hard="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570378 5094 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570387 5094 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570397 5094 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570408 5094 flags.go:64] FLAG: --eviction-soft="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570417 5094 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570426 5094 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570435 5094 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570445 5094 flags.go:64] FLAG: --experimental-mounter-path="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570454 5094 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570463 5094 flags.go:64] FLAG: --fail-swap-on="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570473 5094 flags.go:64] FLAG: --feature-gates="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570485 5094 flags.go:64] FLAG: --file-check-frequency="20s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570494 5094 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570505 5094 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570515 5094 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570525 5094 flags.go:64] FLAG: --healthz-port="10248" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570534 5094 flags.go:64] FLAG: --help="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570544 5094 flags.go:64] FLAG: --hostname-override="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570553 5094 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570563 5094 flags.go:64] FLAG: --http-check-frequency="20s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570572 5094 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570582 5094 flags.go:64] FLAG: --image-credential-provider-config="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570591 5094 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570601 5094 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570610 5094 flags.go:64] FLAG: --image-service-endpoint="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570620 5094 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570629 5094 flags.go:64] FLAG: --kube-api-burst="100" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570639 5094 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570649 5094 flags.go:64] FLAG: --kube-api-qps="50" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570658 5094 flags.go:64] FLAG: --kube-reserved="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570667 5094 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570677 5094 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570686 5094 flags.go:64] FLAG: --kubelet-cgroups="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570695 5094 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570735 5094 flags.go:64] FLAG: --lock-file="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570744 5094 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570754 5094 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570764 5094 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570779 5094 flags.go:64] FLAG: --log-json-split-stream="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570789 5094 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570799 5094 flags.go:64] FLAG: --log-text-split-stream="false" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570809 5094 flags.go:64] FLAG: --logging-format="text" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570818 5094 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570828 5094 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570837 5094 flags.go:64] FLAG: --manifest-url="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570847 5094 flags.go:64] FLAG: --manifest-url-header="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570859 5094 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570869 5094 
flags.go:64] FLAG: --max-open-files="1000000" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570880 5094 flags.go:64] FLAG: --max-pods="110" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570889 5094 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570899 5094 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570908 5094 flags.go:64] FLAG: --memory-manager-policy="None" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570918 5094 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570927 5094 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570937 5094 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570947 5094 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570970 5094 flags.go:64] FLAG: --node-status-max-images="50" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570982 5094 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570994 5094 flags.go:64] FLAG: --oom-score-adj="-999" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571006 5094 flags.go:64] FLAG: --pod-cidr="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571017 5094 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571035 5094 flags.go:64] FLAG: --pod-manifest-path="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571046 5094 flags.go:64] FLAG: --pod-max-pids="-1" Feb 20 
06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571057 5094 flags.go:64] FLAG: --pods-per-core="0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571068 5094 flags.go:64] FLAG: --port="10250"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571080 5094 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571091 5094 flags.go:64] FLAG: --provider-id=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571103 5094 flags.go:64] FLAG: --qos-reserved=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571115 5094 flags.go:64] FLAG: --read-only-port="10255"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571126 5094 flags.go:64] FLAG: --register-node="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571138 5094 flags.go:64] FLAG: --register-schedulable="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571149 5094 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571181 5094 flags.go:64] FLAG: --registry-burst="10"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571192 5094 flags.go:64] FLAG: --registry-qps="5"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571204 5094 flags.go:64] FLAG: --reserved-cpus=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571216 5094 flags.go:64] FLAG: --reserved-memory=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571228 5094 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571239 5094 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571248 5094 flags.go:64] FLAG: --rotate-certificates="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571258 5094 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571267 5094 flags.go:64] FLAG: --runonce="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571276 5094 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571286 5094 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571295 5094 flags.go:64] FLAG: --seccomp-default="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571304 5094 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571314 5094 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571324 5094 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571334 5094 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571344 5094 flags.go:64] FLAG: --storage-driver-password="root"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571353 5094 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571362 5094 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571371 5094 flags.go:64] FLAG: --storage-driver-user="root"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571380 5094 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571390 5094 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571400 5094 flags.go:64] FLAG: --system-cgroups=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571409 5094 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571424 5094 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571433 5094 flags.go:64] FLAG: --tls-cert-file=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571443 5094 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571455 5094 flags.go:64] FLAG: --tls-min-version=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571464 5094 flags.go:64] FLAG: --tls-private-key-file=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571477 5094 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571488 5094 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571501 5094 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571514 5094 flags.go:64] FLAG: --v="2"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571531 5094 flags.go:64] FLAG: --version="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571546 5094 flags.go:64] FLAG: --vmodule=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571557 5094 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571569 5094 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571896 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571912 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571925 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571936 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571945 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571955 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571963 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571971 5094 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571979 5094 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571989 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571998 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572006 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572014 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572022 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572031 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572039 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572047 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572055 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572064 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572072 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572080 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572088 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572096 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572106 5094 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572117 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572126 5094 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572135 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572143 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572152 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572160 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572168 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572176 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572184 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572192 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572200 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572208 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572216 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572224 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572238 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572247 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572255 5094 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572264 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572272 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572280 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572289 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572296 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572308 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572317 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572325 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572334 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572367 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572376 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572385 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572393 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572401 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572411 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572420 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572428 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572436 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572444 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572452 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572460 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572469 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572477 5094 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572485 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572493 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572500 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572508 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572516 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572524 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572532 5094 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.572559 5094 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.588280 5094 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.588365 5094 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588523 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588546 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588556 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588568 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588578 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588589 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588598 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588607 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588615 5094 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588623 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588632 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588640 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588648 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588656 5094 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588665 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588675 5094 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588685 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588694 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588726 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588735 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588743 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588751 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588759 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588767 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588775 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588783 5094 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588791 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588800 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588808 5094 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588816 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588824 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588832 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588840 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588849 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588859 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588868 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588875 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588884 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588892 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588901 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588909 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588917 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588926 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588933 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588944 5094 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588954 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588965 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588976 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588985 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588994 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589002 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589010 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589020 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589030 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589039 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589047 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589055 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589064 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589073 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589082 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589090 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589098 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589106 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589115 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589123 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589131 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589139 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589147 5094 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589155 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589162 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589180 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.589195 5094 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589439 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589451 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589461 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589470 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589480 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589488 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589497 5094 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589505 5094 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589513 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589521 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589529 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589540 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589551 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589560 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589569 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589578 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589588 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589598 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589607 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589616 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589624 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589632 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589641 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589649 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589658 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589666 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589674 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589682 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589690 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589698 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589727 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589736 5094 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589747 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589757 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589766 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589774 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589782 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589791 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589799 5094 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589807 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589815 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589826 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589834 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589843 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589854 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589862 5094 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589870 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589879 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589888 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589896 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589903 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589912 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589920 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589928 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589936 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589944 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589954 5094 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589965 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589975 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589983 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589993 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590002 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590011 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590019 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590028 5094 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590037 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590044 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590052 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590061 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590070 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590078 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.590092 5094 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.590385 5094 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.599771 5094 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.599972 5094 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.602835 5094 server.go:997] "Starting client certificate rotation"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.602887 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.603139 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-13 08:14:35.525513352 +0000 UTC
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.603287 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.635142 5094 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.637854 5094 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.642097 5094 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.661758 5094 log.go:25] "Validated CRI v1 runtime API"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.700447 5094 log.go:25] "Validated CRI v1 image API"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.704471 5094 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.712472 5094 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-20-06-37-34-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.712531 5094 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.740287 5094 manager.go:217] Machine: {Timestamp:2026-02-20 06:46:25.73521531 +0000 UTC m=+0.607842051 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640
SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d25915f7-4d55-43a4-a20b-9e6118746152 BootID:6fb44c16-1595-44a7-b2ec-4faee6098a1e Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b0:f2:bf Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b0:f2:bf Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b0:bd:ac Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2f:8b:03 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bf:be:a5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:31:3f:de Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:77:97:f3 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:bb:56:b8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:75:03:50:85:7e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:5c:c4:8d:a6:0b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 
NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction 
Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.740696 5094 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.740999 5094 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.742849 5094 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743094 5094 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743148 5094 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743420 5094 topology_manager.go:138] "Creating topology manager with none policy" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743436 5094 container_manager_linux.go:303] "Creating device plugin manager" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743936 5094 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743982 5094 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.744342 5094 state_mem.go:36] "Initialized new in-memory state store" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.744450 5094 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748339 5094 kubelet.go:418] "Attempting to sync node with API server" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748394 5094 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748435 5094 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748461 5094 kubelet.go:324] "Adding apiserver pod source" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748482 5094 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 
06:46:25.754111 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.754158 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.754467 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.754329 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.757048 5094 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.758801 5094 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.764785 5094 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766690 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766743 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766755 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766765 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766781 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766793 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766802 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766815 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766825 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766835 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766882 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766893 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.767833 5094 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.768621 5094 server.go:1280] "Started kubelet" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.770108 5094 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.770409 5094 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.771357 5094 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 20 06:46:25 crc systemd[1]: Started Kubernetes Kubelet. Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.774606 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.775344 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.775380 5094 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777062 5094 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777111 5094 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.777387 5094 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777474 5094 server.go:460] "Adding debug handlers to kubelet server" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777960 5094 factory.go:219] Registration of the containerd container factory failed: unable to create containerd 
client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778038 5094 factory.go:55] Registering systemd factory Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778101 5094 factory.go:221] Registration of the systemd container factory successfully Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778220 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 21:17:15.292909303 +0000 UTC Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778369 5094 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.778617 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778949 5094 factory.go:153] Registering CRI-O factory Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.779004 5094 factory.go:221] Registration of the crio container factory successfully Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.780095 5094 factory.go:103] Registering Raw factory Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.780528 5094 manager.go:1196] Started watching for new ooms in manager Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.782891 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.783024 5094 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.783678 5094 manager.go:319] Starting recovery of all containers Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.782864 5094 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895e184107aaeb4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 06:46:25.768566452 +0000 UTC m=+0.641193163,LastTimestamp:2026-02-20 06:46:25.768566452 +0000 UTC m=+0.641193163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790303 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790364 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: 
I0220 06:46:25.790381 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790396 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790408 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790422 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790435 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790449 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790465 5094 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790478 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790491 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790503 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790518 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790536 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790579 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790592 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790606 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790619 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790634 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790678 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790690 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790722 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790735 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790751 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790767 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790807 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790825 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" 
seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790861 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793268 5094 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793364 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793400 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793426 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793453 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793478 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793501 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793527 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793550 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793574 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793599 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793622 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793645 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793683 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793940 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793994 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794025 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794055 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794092 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794118 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794147 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794174 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794202 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794229 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794261 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794306 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794342 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794375 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794412 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794444 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794497 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794526 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794556 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794585 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794614 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" 
seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794643 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794785 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794809 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794834 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794857 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794878 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 20 06:46:25 crc 
kubenswrapper[5094]: I0220 06:46:25.794899 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794923 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794945 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794967 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794992 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795014 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795035 5094 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795057 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795079 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795100 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795120 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795141 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795164 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795188 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795207 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795226 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795248 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795269 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795291 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795314 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795337 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795360 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795379 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795400 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795421 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795443 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795465 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795483 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795504 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795525 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795544 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: 
I0220 06:46:25.795567 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795586 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795605 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795624 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795643 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795675 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795725 5094 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795747 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795766 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795788 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795821 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795841 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795863 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795887 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795971 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795999 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796021 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796042 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796062 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796082 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796102 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796123 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796144 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796163 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796184 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796206 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796225 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796273 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796293 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796314 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796333 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796357 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796380 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796402 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796525 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796554 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796585 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796608 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796632 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796655 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796676 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796731 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796765 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796786 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796807 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796827 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796846 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796866 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796884 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796907 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796927 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796946 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796964 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796985 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797011 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797030 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797050 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797119 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797139 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797157 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797175 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797192 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797212 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797231 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797253 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797272 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797290 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797311 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797330 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797349 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797374 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797453 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797474 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797496 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797518 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797540 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797560 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797583 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797614 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797634 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797655 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797684 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797741 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797763 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797783 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797807 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797830 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797851 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797871 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797891 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797910 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797931 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797952 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797971 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797995 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798023 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798047 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798069 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798089 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798108 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798128 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798148 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798174 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798196 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798215 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798233 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798253 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798270 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798288 5094 reconstruct.go:97] "Volume reconstruction finished"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798302 5094 reconciler.go:26] "Reconciler: start to sync state"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.815436 5094 manager.go:324] Recovery completed
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.832901 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.834917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.834975 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.834992 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.836058 5094 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.836078 5094 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.836102 5094 state_mem.go:36] "Initialized new in-memory state store"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.836398 5094 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.838762 5094 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.838837 5094 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.838893 5094 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.839098 5094 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.841983 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.842073 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.859954 5094 policy_none.go:49] "None policy: Start"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.861912 5094 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.861971 5094 state_mem.go:35] "Initializing new in-memory state store"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.877781 5094 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.929369 5094 manager.go:334] "Starting Device Plugin manager"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.929596 5094 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.929615 5094 server.go:79] "Starting device plugin registration server"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.930216 5094 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.930231 5094 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.931306 5094 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.931405 5094 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.931415 5094 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.939937 5094 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.940062 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942044 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942168 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942270 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942535 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.943313 5094 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943527 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943629 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943684 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943801 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944016 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944644 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944781 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945030 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945502 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945551 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945700 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946081 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946154 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946299 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946428 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946870 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947048 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947168 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947200 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947214 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947421 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947517 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947605 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948120 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948298 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948337 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.979932 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms"
Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002100 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002165 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002196 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName:
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002244 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002267 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002349 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002408 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002439 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002466 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002504 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.003084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.003165 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.003799 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.030631 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.032887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.033038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.033061 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.033150 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.034435 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.106988 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107066 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107123 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107159 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107238 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107276 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107310 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107313 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107417 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107453 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107497 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107646 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107654 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107662 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107742 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107789 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107799 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107850 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107887 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107919 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") 
" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.234692 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.236888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.236958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.236980 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.237021 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.237602 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.283541 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.311488 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.320915 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.340193 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf WatchSource:0}: Error finding container 523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf: Status 404 returned error can't find the container with id 523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.345855 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.351752 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.355975 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483 WatchSource:0}: Error finding container 43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483: Status 404 returned error can't find the container with id 43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483 Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.377298 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7 WatchSource:0}: Error finding container 610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7: Status 404 returned error can't find the container with id 610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7 Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.379445 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d WatchSource:0}: Error finding container 66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d: Status 404 returned error can't find the container with id 66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.381153 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection 
refused" interval="800ms" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.381333 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e WatchSource:0}: Error finding container 2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e: Status 404 returned error can't find the container with id 2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.638108 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639732 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639776 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.640420 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.775960 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 
06:46:26.779154 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 01:20:50.937149723 +0000 UTC Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.845189 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.846916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.848356 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.849893 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.851541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7"} Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.935166 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.935313 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.936485 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.936586 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.950423 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.950543 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: 
connection refused" logger="UnhandledError" Feb 20 06:46:27 crc kubenswrapper[5094]: W0220 06:46:27.048475 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.048607 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.182432 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.441175 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443570 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443642 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443661 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443732 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.444760 5094 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.665903 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.667410 5094 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.776373 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.779886 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 03:52:46.588087905 +0000 UTC Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.857261 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.857341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 
06:46:27.857414 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.859423 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.859485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.859510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.861902 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.861968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.861989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.863731 5094 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.863835 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.863917 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.865613 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.865683 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.865740 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.866278 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.866373 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.866464 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.867760 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.867800 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc 
kubenswrapper[5094]: I0220 06:46:27.867819 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.869205 5094 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.869269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.869340 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.870623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.870665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.870680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.871147 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.872362 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.872411 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.872429 5094 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.775991 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.780365 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 13:58:58.971928842 +0000 UTC Feb 20 06:46:28 crc kubenswrapper[5094]: E0220 06:46:28.784498 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.876879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.876942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.876959 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.879120 5094 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378" exitCode=0 Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.879241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.879354 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.880837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.880906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.880933 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.883385 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.883463 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.884569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.884634 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: 
I0220 06:46:28.884655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.887289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.887406 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.889205 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.889257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.889277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891838 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891969 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.893121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.893177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.893189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.045801 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.047909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.047987 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.048014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.048056 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:29 crc kubenswrapper[5094]: E0220 06:46:29.048878 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:29 crc kubenswrapper[5094]: W0220 06:46:29.056886 5094 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:29 crc kubenswrapper[5094]: E0220 06:46:29.056973 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.780586 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:43:03.416637411 +0000 UTC Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.901176 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1"} Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.901324 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.901326 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3"} Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.902911 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.902959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.902979 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.905579 5094 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c" exitCode=0 Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.905828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c"} Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.905949 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.906021 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.906021 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.906985 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.907287 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908298 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908314 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908369 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908635 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908654 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.781884 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 16:29:34.544350262 +0000 UTC Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f"} Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914761 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b"} Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa"} Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914812 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914930 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.916682 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.916809 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.916850 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.186274 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.493105 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:31 crc 
kubenswrapper[5094]: I0220 06:46:31.782940 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 13:58:26.935767669 +0000 UTC Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.892580 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.910938 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568"} Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1"} Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926653 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926651 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.928795 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.928866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.928887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.929255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.929350 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.929372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.249839 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.251923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.252004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.252026 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.252073 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.783933 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 12:51:20.221342636 +0000 UTC Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.930184 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.930248 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.931760 
5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.931849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.931869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.932666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.932813 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.932842 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:33 crc kubenswrapper[5094]: I0220 06:46:33.784509 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 15:52:43.183778197 +0000 UTC Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.079895 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.080152 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.082406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.082461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.082480 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.785648 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 00:03:44.662012926 +0000 UTC Feb 20 06:46:35 crc kubenswrapper[5094]: I0220 06:46:35.786612 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 19:08:11.579236036 +0000 UTC Feb 20 06:46:35 crc kubenswrapper[5094]: E0220 06:46:35.945003 5094 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.084344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.084807 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.087110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.087234 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.087266 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.784907 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.785164 5094 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.786895 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 03:10:57.889639584 +0000 UTC Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.786916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.787040 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.787066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.793441 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.795840 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.796255 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.797927 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.798012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.798033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.943359 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.943598 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.944947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.945122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.945241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.787277 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 04:36:52.626427321 +0000 UTC Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.794825 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.891538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.891930 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.893969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.894159 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.894316 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.946158 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.948129 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.948188 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.948211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.953498 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.788450 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 23:41:55.406672566 +0000 UTC Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.949199 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.950886 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.950956 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.950980 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:39 crc kubenswrapper[5094]: W0220 06:46:39.440847 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.441030 5094 trace.go:236] Trace[801421912]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:29.438) (total time: 10002ms): Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[801421912]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:46:39.440) Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[801421912]: [10.002187394s] [10.002187394s] END Feb 20 06:46:39 crc kubenswrapper[5094]: E0220 06:46:39.441082 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 06:46:39 crc kubenswrapper[5094]: W0220 06:46:39.484416 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.484564 5094 trace.go:236] Trace[1063127187]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:29.483) (total time: 10001ms): Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[1063127187]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:46:39.484) Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[1063127187]: [10.001351453s] [10.001351453s] END Feb 
20 06:46:39 crc kubenswrapper[5094]: E0220 06:46:39.484603 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.777345 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.788851 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 02:59:04.323079362 +0000 UTC Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.952088 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.953384 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.953431 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.953447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:39 crc kubenswrapper[5094]: W0220 06:46:39.961004 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.961102 5094 trace.go:236] Trace[893958956]: "Reflector 
ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:29.959) (total time: 10001ms): Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[893958956]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:46:39.960) Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[893958956]: [10.001877736s] [10.001877736s] END Feb 20 06:46:39 crc kubenswrapper[5094]: E0220 06:46:39.961126 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.592992 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.593085 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.603485 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.603591 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.789129 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 14:14:01.78015178 +0000 UTC Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.795472 5094 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.795608 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 20 06:46:41 crc kubenswrapper[5094]: I0220 06:46:41.503583 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]log ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]etcd ok Feb 20 06:46:41 crc kubenswrapper[5094]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/generic-apiserver-start-informers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/priority-and-fairness-filter ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-apiextensions-informers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-apiextensions-controllers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/crd-informer-synced ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-system-namespaces-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 20 06:46:41 crc kubenswrapper[5094]: 
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/bootstrap-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-kube-aggregator-informers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-registration-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-discovery-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]autoregister-completion ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-openapi-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: livez check failed Feb 20 06:46:41 crc kubenswrapper[5094]: I0220 06:46:41.503689 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:46:41 crc kubenswrapper[5094]: I0220 06:46:41.790080 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:52:23.064929019 +0000 UTC Feb 20 06:46:42 crc kubenswrapper[5094]: I0220 06:46:42.790189 5094 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 17:28:42.066674577 +0000 UTC Feb 20 06:46:43 crc kubenswrapper[5094]: I0220 06:46:43.299303 5094 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:43 crc kubenswrapper[5094]: I0220 06:46:43.790868 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:22:01.432598002 +0000 UTC Feb 20 06:46:43 crc kubenswrapper[5094]: I0220 06:46:43.890541 5094 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:44 crc kubenswrapper[5094]: I0220 06:46:44.791659 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 13:58:01.484458188 +0000 UTC Feb 20 06:46:45 crc kubenswrapper[5094]: E0220 06:46:45.599474 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 20 06:46:45 crc kubenswrapper[5094]: E0220 06:46:45.604210 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.604222 5094 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.604527 5094 trace.go:236] Trace[580338358]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:32.522) (total time: 13082ms): Feb 20 06:46:45 crc kubenswrapper[5094]: Trace[580338358]: ---"Objects listed" error: 
13082ms (06:46:45.604) Feb 20 06:46:45 crc kubenswrapper[5094]: Trace[580338358]: [13.082283811s] [13.082283811s] END Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.604576 5094 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.617168 5094 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.643065 5094 csr.go:261] certificate signing request csr-7btk2 is approved, waiting to be issued Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.655652 5094 csr.go:257] certificate signing request csr-7btk2 is issued Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.728209 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54088->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.728278 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54102->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.728303 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54088->192.168.126.11:17697: read: connection reset by peer" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 
06:46:45.728361 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54102->192.168.126.11:17697: read: connection reset by peer" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.791990 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 19:55:32.592918222 +0000 UTC Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.792171 5094 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.968544 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.970425 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1" exitCode=255 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.970462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1"} Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.985634 5094 scope.go:117] "RemoveContainer" containerID="f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.504565 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:46 crc kubenswrapper[5094]: 
I0220 06:46:46.657781 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-20 06:41:45 +0000 UTC, rotation deadline is 2026-11-21 02:59:36.658512115 +0000 UTC Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.657843 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6572h12m50.000672813s for next certificate rotation Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.775322 5094 apiserver.go:52] "Watching apiserver" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.780757 5094 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781110 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781678 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781783 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781576 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.781842 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781942 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.782077 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.782194 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.782256 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.784691 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.784917 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.784957 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.786999 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787123 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787136 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787371 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787938 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.789749 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.792137 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-11-25 03:17:21.741923958 +0000 UTC Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.821339 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.844929 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.858128 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.879216 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.879863 5094 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.895117 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911859 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911907 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912015 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912032 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912053 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912072 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912109 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912130 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912149 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912169 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912187 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912206 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912261 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912289 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912308 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912328 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912366 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912401 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912397 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912437 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912455 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912475 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912495 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912534 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912551 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912656 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912737 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912756 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912777 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912837 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912856 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912877 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912897 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912924 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 
06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912992 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913016 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913036 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913038 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913190 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913236 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913298 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913304 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913325 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913409 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913433 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913490 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913513 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913541 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914081 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914112 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914179 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916341 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916382 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916417 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916675 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916732 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916769 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916800 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916832 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916937 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913508 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913872 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914123 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914179 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914188 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914334 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914502 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914788 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914862 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915166 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915335 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915358 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915434 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915460 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915599 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915614 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915641 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915671 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916107 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916120 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916196 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917099 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917793 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917838 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917983 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918018 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918054 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918127 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918166 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918243 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918275 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918309 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918341 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918377 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918411 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918443 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918507 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918543 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918648 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918675 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918749 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918776 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918846 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918922 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919020 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919136 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919201 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919274 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919301 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919325 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919372 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919398 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919423 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 
06:46:46.919450 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919475 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919528 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919554 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919604 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919631 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919679 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 06:46:46 crc kubenswrapper[5094]: 
I0220 06:46:46.919748 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919852 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919929 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919956 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919980 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920003 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920028 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920055 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920088 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " 
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920155 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920182 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920206 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920230 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920258 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920282 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920309 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920336 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920360 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 
06:46:46.920417 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920443 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920468 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920493 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920555 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920594 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920623 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920684 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920733 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 
06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920763 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920792 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.926925 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.926975 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927018 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927054 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927106 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927201 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927228 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927254 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927281 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927306 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927331 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927398 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927433 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927561 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927621 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927648 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927672 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927730 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929075 5094 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929726 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929849 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929941 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930080 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930111 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930137 5094 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930161 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930185 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930214 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930239 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930263 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930290 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930317 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930340 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930362 5094 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930385 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930409 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930433 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930460 5094 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930483 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930504 5094 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930526 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930548 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930569 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930592 5094 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930613 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930634 5094 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" 
(UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930655 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930679 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930729 5094 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930753 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930774 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930796 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930818 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930841 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917336 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917429 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917814 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918001 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918134 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918750 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919073 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920863 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920888 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921118 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921332 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921429 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921765 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922217 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922450 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922677 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.923047 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927476 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.928592 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929299 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929356 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929466 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930185 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930206 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930329 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930651 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.931220 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.932017 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.932943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933307 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933997 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.934827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.936552 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.936919 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937428 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937777 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938282 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938533 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938789 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.940881 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.941755 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942279 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942397 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942690 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943036 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943182 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943630 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943920 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.944638 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.944762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.946875 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.947033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.948378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949165 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949284 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949654 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949614 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949969 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950048 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950432 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951040 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951096 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951232 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951498 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951509 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951801 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951971 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.952455 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.952475 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.953955 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954601 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954654 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.956769 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.959520 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.959800 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.959814 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960187 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960505 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960601 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.961017 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.961285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.961383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.963166 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.963622 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.965157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.966682 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.966835 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.966931 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.967181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967468 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967498 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967513 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967579 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.467557136 +0000 UTC m=+22.340183847 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.967682 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.967989 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968215 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968307 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968405 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968697 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.969461 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.969622 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.469581664 +0000 UTC m=+22.342208375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.969864 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.969915 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970579 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970712 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970965 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.971161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.971978 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.972186 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975355 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975576 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.976626 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.976846 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.976839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977024 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977065 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977240 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.982592 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.982818 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983312 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983373 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.483355666 +0000 UTC m=+22.355982377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983796 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983841 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.483832808 +0000 UTC m=+22.356459519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.983850 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.984546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.987871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.987981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988062 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988365 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988668 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988799 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988907 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.989315 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.989964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.992004 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.992785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.995868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.998087 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.999525 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.000423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001618 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001646 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001664 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001739 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.501717369 +0000 UTC m=+22.374344080 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.013499 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.019169 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.019887 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.020034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23"} Feb 20 06:46:47 crc kubenswrapper[5094]: 
I0220 06:46:47.020876 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031869 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031972 5094 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031984 5094 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031995 5094 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032005 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032015 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032026 5094 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032035 5094 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032045 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032055 5094 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032064 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032073 5094 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 20 
06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032081 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032089 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032098 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032109 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032119 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032129 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032140 5094 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032150 5094 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032160 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032169 5094 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032178 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032186 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032195 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032204 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032212 5094 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032221 5094 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032229 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032259 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032267 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032276 5094 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032285 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032352 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032294 5094 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032408 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032457 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032483 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032499 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: 
I0220 06:46:47.032514 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032529 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032543 5094 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032558 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032573 5094 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032588 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032605 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032621 5094 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032636 5094 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032649 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032661 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032672 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032684 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032696 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032757 5094 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node 
\"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032771 5094 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032784 5094 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032797 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032810 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032822 5094 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032836 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032849 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032861 5094 
reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032873 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032886 5094 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032897 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032911 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032923 5094 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032937 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032950 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032963 5094 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032978 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032991 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033009 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033021 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033034 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033047 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" 
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033059 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033071 5094 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033083 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033096 5094 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033108 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033122 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033134 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc 
kubenswrapper[5094]: I0220 06:46:47.033147 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033160 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033174 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033186 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033200 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033213 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033227 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033239 5094 reconciler_common.go:293] 
"Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033252 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033263 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033275 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033288 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033299 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033311 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033323 5094 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033337 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033348 5094 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033359 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033370 5094 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033383 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033394 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033407 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" 
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033420 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033434 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033446 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033462 5094 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033477 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033491 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033503 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033515 5094 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033528 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033540 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033554 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033568 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033583 5094 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033598 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033611 5094 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033624 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033638 5094 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033652 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033664 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033677 5094 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033690 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033730 5094 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033746 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033759 5094 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033772 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033785 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033798 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033810 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033822 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033835 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033847 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033860 5094 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033874 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033887 5094 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033902 5094 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033914 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033929 5094 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033943 5094 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033956 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033968 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033981 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033993 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034006 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034018 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034050 5094 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034063 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034075 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034089 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034101 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034113 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034125 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034138 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034149 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034161 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034173 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034186 5094 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.037082 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" 
(OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.039343 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.040149 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.054013 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.054399 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.070646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.087090 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.099458 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.104973 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.106039 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.112527 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.122785 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8wch6"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.123117 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.123522 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qzxk2"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.123904 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.124027 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.126518 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.127183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.127444 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.127611 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.130512 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.130828 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.146109 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.146234 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.146269 5094 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.189053 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7
462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597
126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.214811 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.240844 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253116 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bc82500-7462-4daa-9eff-116399acb06a-host\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6p6h\" (UniqueName: \"kubernetes.io/projected/3bc82500-7462-4daa-9eff-116399acb06a-kube-api-access-s6p6h\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81c0f95-7b6e-4a44-8115-f517fc8f4052-hosts-file\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " 
pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5sr\" (UniqueName: \"kubernetes.io/projected/d81c0f95-7b6e-4a44-8115-f517fc8f4052-kube-api-access-mr5sr\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253265 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3bc82500-7462-4daa-9eff-116399acb06a-serviceca\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.278861 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.305477 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.323598 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.338812 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354245 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3bc82500-7462-4daa-9eff-116399acb06a-serviceca\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354289 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bc82500-7462-4daa-9eff-116399acb06a-host\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354306 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81c0f95-7b6e-4a44-8115-f517fc8f4052-hosts-file\") pod \"node-resolver-qzxk2\" 
(UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354322 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5sr\" (UniqueName: \"kubernetes.io/projected/d81c0f95-7b6e-4a44-8115-f517fc8f4052-kube-api-access-mr5sr\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6p6h\" (UniqueName: \"kubernetes.io/projected/3bc82500-7462-4daa-9eff-116399acb06a-kube-api-access-s6p6h\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.355564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3bc82500-7462-4daa-9eff-116399acb06a-serviceca\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.355629 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bc82500-7462-4daa-9eff-116399acb06a-host\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.355680 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81c0f95-7b6e-4a44-8115-f517fc8f4052-hosts-file\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.360901 5094 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.373770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6p6h\" (UniqueName: \"kubernetes.io/projected/3bc82500-7462-4daa-9eff-116399acb06a-kube-api-access-s6p6h\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.377612 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.377724 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5sr\" (UniqueName: \"kubernetes.io/projected/d81c0f95-7b6e-4a44-8115-f517fc8f4052-kube-api-access-mr5sr\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.388722 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.399467 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.489012 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.501083 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bc82500_7462_4daa_9eff_116399acb06a.slice/crio-19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e WatchSource:0}: Error finding container 19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e: Status 404 returned error can't find the container with id 19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.529735 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.554258 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-56ppq"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.554848 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556624 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556831 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.556918 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.556968 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.55695381 +0000 UTC m=+23.429580531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557036 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557028302 +0000 UTC m=+23.429655023 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557126 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557145 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557161 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557192 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557184245 +0000 UTC m=+23.429810966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557243 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557270 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557262537 +0000 UTC m=+23.429889258 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557320 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557333 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557343 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557371 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557363119 +0000 UTC m=+23.429989840 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.558261 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.558618 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.558802 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.559022 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.562015 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.580164 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.595719 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.609160 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.619069 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.630177 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.640498 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659016 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-rootfs\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659096 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjnr\" (UniqueName: 
\"kubernetes.io/projected/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-kube-api-access-hzjnr\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659129 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-proxy-tls\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.666403 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.678302 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.692475 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.703378 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-proxy-tls\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760254 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-rootfs\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760287 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjnr\" (UniqueName: \"kubernetes.io/projected/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-kube-api-access-hzjnr\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.762912 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-rootfs\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.763360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.793079 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 
03:45:26.439249222 +0000 UTC Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.805920 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.811235 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.820290 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.830249 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.839674 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.839843 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.843889 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.844570 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-proxy-tls\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.844871 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjnr\" (UniqueName: \"kubernetes.io/projected/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-kube-api-access-hzjnr\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.845546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.846665 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.847313 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.848394 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.848958 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.849546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.852257 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.856755 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.859168 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.859970 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.860515 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.861410 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.861976 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.862549 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.864342 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.864918 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.866277 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.866745 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.867374 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.868500 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.869018 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.870030 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.870483 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.872633 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.873234 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.874081 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.874602 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.878988 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.879506 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.880193 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.880744 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.881280 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.891084 5094 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.891219 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.894278 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.894786 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.898908 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.907308 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.915108 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.915160 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.916313 5094 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.917268 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.918529 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.919503 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.920225 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.924757 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.925444 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.926407 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.927053 5094 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.931067 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.932023 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.933017 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.933510 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.937820 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.938436 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.939086 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.940076 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.940616 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.940833 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.952413 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.962433 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.962671 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9vd4p"]
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.963378 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.969464 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.969924 5094 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object
Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.969966 5094 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.970039 5094 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object
Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.970051 5094 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.970093 5094 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object
Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.970104 5094 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.971928 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.979226 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zr8rz"]
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.979620 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.980455 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"]
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.981328 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.983179 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987047 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987210 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987322 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987816 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987933 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.992749 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.993935 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.998632 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.036243 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.051197 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.051272 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7a14287aa061d6bafd555859cf0e70fa9a59fff7f5dbb8ebf7619d92d78b43cd"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.056416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.056457 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"1b66ffe0a5e617d0ba35eabca4823ca8accc1ec1f5cd91eb035283a65ce25291"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.058159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8wch6" event={"ID":"3bc82500-7462-4daa-9eff-116399acb06a","Type":"ContainerStarted","Data":"4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.058225 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8wch6" event={"ID":"3bc82500-7462-4daa-9eff-116399acb06a","Type":"ContainerStarted","Data":"19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063158 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-hostroot\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063219 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063242 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-socket-dir-parent\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063304 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-bin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063327 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-multus-certs\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063351 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-netns\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063435 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-system-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063509 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063525 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063545 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-etc-kubernetes\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063564 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063599 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-system-cni-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063618 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-os-release\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063644 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063663 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-cnibin\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063680 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063712 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063744 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063761 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cni-binary-copy\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-k8s-cni-cncf-io\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063795 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-daemon-config\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063821 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063839 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-os-release\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063876 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-multus\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-conf-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fphl\" (UniqueName: \"kubernetes.io/projected/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-kube-api-access-8fphl\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063925 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-kubelet\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063958 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cnibin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064010 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064025 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064042 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064058 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjkf4\" (UniqueName: \"kubernetes.io/projected/19cce34f-67a6-48c9-a396-621c5811b6cd-kube-api-access-cjkf4\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p"
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064477 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064501 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064510 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7fbde06fa389195eb5092bab20c7c50624673f2d114a2da417f9632638713f3c"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.066632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"768c3ecca738877395d88ae13a22c6270e589ddbb87bdd0e770f9b8fbec53733"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.069664 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzxk2" event={"ID":"d81c0f95-7b6e-4a44-8115-f517fc8f4052","Type":"ContainerStarted","Data":"ae7b8fdfcbdaed00729e63d6458f951564b2a56987eb0e01cd5559425dff8cb9"}
Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.069822 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.077225 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.080628 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.086370 5094 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.109317 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.123639 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.135920 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.148594 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165623 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-kubelet\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165732 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cnibin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165750 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-kubelet\") pod \"multus-zr8rz\" (UID: 
\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165800 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165821 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165848 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165858 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjkf4\" (UniqueName: \"kubernetes.io/projected/19cce34f-67a6-48c9-a396-621c5811b6cd-kube-api-access-cjkf4\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165878 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165883 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165916 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165956 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165941 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cnibin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165896 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166093 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-socket-dir-parent\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166609 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-socket-dir-parent\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166645 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166660 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-bin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166735 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-bin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166739 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-hostroot\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-multus-certs\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " 
pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-hostroot\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166934 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-multus-certs\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166953 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167039 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167110 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167140 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167165 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-system-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167182 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-netns\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167209 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167228 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167246 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167264 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-etc-kubernetes\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167316 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-system-cni-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-os-release\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167352 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167375 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167386 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-etc-kubernetes\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167414 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-system-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167390 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167410 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-cnibin\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-system-cni-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167454 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167487 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-cnibin\") pod 
\"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167595 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167489 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-netns\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167853 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-os-release\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167905 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168527 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167941 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cni-binary-copy\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168587 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-k8s-cni-cncf-io\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168608 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cni-binary-copy\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168610 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-daemon-config\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168655 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168677 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-os-release\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168746 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-multus\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-conf-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168782 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fphl\" (UniqueName: \"kubernetes.io/projected/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-kube-api-access-8fphl\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-daemon-config\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169082 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-k8s-cni-cncf-io\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-os-release\") pod \"multus-zr8rz\" (UID: 
\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169158 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169431 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169478 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-multus\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169512 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-conf-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.173125 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc 
kubenswrapper[5094]: I0220 06:46:48.174965 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.189327 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.191785 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.204347 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.216431 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.225737 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.236785 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.251518 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.265508 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.278756 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.289655 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.303206 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.308631 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: W0220 06:46:48.328195 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1c36de3_d36b_48ed_9d4d_3aa52d72add0.slice/crio-7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656 WatchSource:0}: Error finding container 7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656: Status 404 returned error can't find the container with id 7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656 Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.573397 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.573640 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.5736057 +0000 UTC m=+25.446232411 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574189 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574274 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574412 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574438 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574481 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574500 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574517 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574543 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.574531612 +0000 UTC m=+25.447158563 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574581 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574594 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574610 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574747 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574585 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.574563573 +0000 UTC m=+25.447190294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574820 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.574778168 +0000 UTC m=+25.447405089 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574861 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.57483847 +0000 UTC m=+25.447465491 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.793897 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 17:05:03.346301971 +0000 UTC Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.839235 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.839376 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.839448 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.839671 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.075552 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzxk2" event={"ID":"d81c0f95-7b6e-4a44-8115-f517fc8f4052","Type":"ContainerStarted","Data":"5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.090810 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" exitCode=0 Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.090934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.091032 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.101060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.116135 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.135940 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.145160 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.148611 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.161778 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjkf4\" (UniqueName: \"kubernetes.io/projected/19cce34f-67a6-48c9-a396-621c5811b6cd-kube-api-access-cjkf4\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.161822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fphl\" (UniqueName: \"kubernetes.io/projected/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-kube-api-access-8fphl\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.163124 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: E0220 06:46:49.167689 5094 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Feb 20 06:46:49 crc kubenswrapper[5094]: E0220 06:46:49.167777 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist podName:19cce34f-67a6-48c9-a396-621c5811b6cd nodeName:}" failed. No retries permitted until 2026-02-20 06:46:49.667757228 +0000 UTC m=+24.540383959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-9vd4p" (UID: "19cce34f-67a6-48c9-a396-621c5811b6cd") : failed to sync configmap cache: timed out waiting for the condition Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.179167 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.194926 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.195335 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.201186 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zr8rz" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.210927 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.226719 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.239853 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.256429 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.258234 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.272608 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.294863 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.315200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.332981 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.350488 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.365943 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.383317 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.400714 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.419831 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.436258 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.450807 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.468196 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.497830 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.522624 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.549512 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.583229 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.631566 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.655599 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.672867 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.688755 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.689215 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.689959 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.776912 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: W0220 06:46:49.789355 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cce34f_67a6_48c9_a396_621c5811b6cd.slice/crio-4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f WatchSource:0}: Error finding container 4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f: Status 404 returned error can't find the container with id 4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.795186 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 14:25:49.006747948 +0000 UTC Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.839414 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:49 crc kubenswrapper[5094]: E0220 06:46:49.839596 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.108993 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerStarted","Data":"4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.111401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.111444 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"2fb06227a0e267c5944978d6f12d04355e119b5a602d6c9141c724b852fb0281"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115304 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115353 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115365 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" 
event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115374 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.134098 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var
/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a56
46fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.154155 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.168756 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.181501 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.199836 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.213995 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.229010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.243355 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.255684 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.266613 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.282352 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.301225 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.324618 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.341556 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.366174 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.598909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599042 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599071 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599107 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599070531 +0000 UTC m=+29.471697252 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599183 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599220 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599239 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599225836 +0000 UTC m=+29.471852547 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599375 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599419 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.59941004 +0000 UTC m=+29.472036761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599532 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599548 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599561 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599596 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599586354 +0000 UTC m=+29.472213075 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599596 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599620 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599631 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599663 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599655696 +0000 UTC m=+29.472282417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.796092 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 13:15:01.919546015 +0000 UTC Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.839464 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.839558 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.839628 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.839790 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.130914 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d" exitCode=0 Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.131262 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.135791 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.143043 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.143123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" 
event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.171164 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.197975 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.216825 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.237744 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.274814 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.292765 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.321254 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.338140 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.354335 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.367557 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.380281 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.397518 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.415463 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.429082 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.443072 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.458773 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.470810 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.483517 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.503372 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.522395 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.543208 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.569878 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.595460 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.609077 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.628952 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.650722 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.666015 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.675795 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.697190 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.712282 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.796246 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 15:22:22.257918386 +0000 UTC Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.839156 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:51 crc kubenswrapper[5094]: E0220 06:46:51.839307 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.005168 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008502 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008660 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.017362 5094 kubelet_node_status.go:115] "Node was previously 
registered" node="crc" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.017671 5094 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026426 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026556 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.050487 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054897 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054975 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.055087 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.069755 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074133 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074184 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074215 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.092412 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096568 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096638 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096663 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096683 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.109778 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114182 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114263 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114293 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114312 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.131095 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.131291 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133475 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133509 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.148650 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d" exitCode=0 Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.150470 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.169194 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.196132 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.215860 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.232868 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239200 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239256 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.250417 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.281529 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.332982 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341934 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341978 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.360230 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.377749 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.400261 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.419961 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.433571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446433 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446445 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446475 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.452068 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.479453 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.502250 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549279 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549538 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549888 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.653944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654403 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654669 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654830 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.757896 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.757996 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.758013 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.758050 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.758061 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.797338 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:19:28.192964668 +0000 UTC Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.839272 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.839477 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.840077 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.840263 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860913 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860976 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964367 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964408 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.067998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068073 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068195 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.157020 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776" exitCode=0 Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.157124 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.171914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174813 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174863 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174883 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174931 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.183314 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.218030 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.253755 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.277019 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.288936 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.289394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.289600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.289838 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.290060 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.296208 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.318006 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.339611 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.354363 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.377369 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 
06:46:53.394364 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.394957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395083 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395098 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.414137 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.432908 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.451796 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.468375 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.487019 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.498637 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.498767 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.498902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.499002 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.499111 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602605 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602622 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602681 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706269 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706360 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706379 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.798384 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:40:11.789592844 +0000 UTC Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811091 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811192 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811243 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.839930 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:53 crc kubenswrapper[5094]: E0220 06:46:53.840204 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.914969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915080 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915115 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915140 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019349 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123095 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123213 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123261 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.182042 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd" exitCode=0 Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.182144 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.211418 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227614 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227694 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227751 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 
06:46:54.227789 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227876 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.238238 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.259823 5094 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.281967 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355
12335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.311309 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332601 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332656 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332735 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332758 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.336320 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.361775 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.385330 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.404377 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.426044 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436563 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436593 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436607 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.445340 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.466974 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.497457 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540781 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540878 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540930 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.582344 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.605569 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.645447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.646056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.646068 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc 
kubenswrapper[5094]: I0220 06:46:54.646107 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.646123 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653161 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653393 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.653352616 +0000 UTC m=+37.525979327 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653716 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653768 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653800 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653823 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653912 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.653889528 +0000 UTC m=+37.526516279 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653947 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653970 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654118 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654131 5094 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654153 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654182 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654244 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.654155225 +0000 UTC m=+37.526781976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654299 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.654282558 +0000 UTC m=+37.526909299 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654338 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.654325009 +0000 UTC m=+37.526951760 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749073 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749147 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749212 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.799043 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 01:45:15.327120053 +0000 UTC Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.839534 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.839561 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.839867 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.840000 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853040 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853151 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956088 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956149 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956196 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956214 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059885 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059975 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059999 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163598 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.194531 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.195379 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.195449 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.201885 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f" exitCode=0 Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.201941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.215915 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.238328 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.238313 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.253764 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271595 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271698 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271756 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.272072 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.293726 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c56
25d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.314089 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.331388 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.349436 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.365441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc 
kubenswrapper[5094]: I0220 06:46:55.374324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374377 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.397339 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.414095 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.430442 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.444189 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.472674 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476311 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.489447 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.501132 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.513124 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.524495 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.538241 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.556187 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.574653 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579314 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579433 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579672 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.589223 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.602869 5094 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.603055 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.646889 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.659824 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.672474 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.684319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.684799 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.684927 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.685033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.685134 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.686025 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.704632 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.725211 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.742515 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.788935 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789291 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789374 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789454 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.800193 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 08:42:04.895454614 +0000 UTC Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.840040 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:55 crc kubenswrapper[5094]: E0220 06:46:55.840363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.856700 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.873805 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.891638 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893613 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893645 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893666 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.910463 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.941454 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8
b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f376894
9b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.965151 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.986639 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997752 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997762 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997782 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997794 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.006239 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.030457 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.045909 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.060382 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.079600 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.095097 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102378 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102409 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102427 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.113474 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.132277 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c56
25d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.205495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.205869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.205995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.206134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.206225 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.211719 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d" exitCode=0 Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.211862 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.212499 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.231470 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.248277 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.267571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.289723 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.297908 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309733 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309783 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309794 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309810 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309821 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.310798 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.333079 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.355395 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.372072 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.393382 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413002 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413064 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413220 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413233 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.429207 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.445808 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.465566 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.484496 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.503451 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515396 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515417 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515429 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.521247 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.538236 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.553781 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.569599 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.582779 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.599996 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.617872 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618285 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618358 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.639131 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.656934 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.676681 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.706082 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.721671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.721910 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.721994 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.722092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.722181 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.729268 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.752457 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.768843 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.800968 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 17:05:15.545691176 +0000 UTC Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825395 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825414 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 
06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825461 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.826265 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.839391 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.839416 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:56 crc kubenswrapper[5094]: E0220 06:46:56.839615 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:56 crc kubenswrapper[5094]: E0220 06:46:56.840005 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.928961 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929283 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929344 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929522 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032820 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032898 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032913 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032961 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.136993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137040 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137049 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137084 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.224101 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerStarted","Data":"a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240555 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240588 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.250971 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.269898 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.290232 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.307431 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343660 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343731 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343749 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343815 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343839 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.359178 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.385002 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.400406 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.415081 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.429297 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc 
kubenswrapper[5094]: I0220 06:46:57.446676 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446690 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446757 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.464375 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.482005 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.504204 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.527022 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549556 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.555009 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.579618 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652763 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652816 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652843 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652854 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755764 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755817 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755846 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755857 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.801683 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:43:24.765455514 +0000 UTC Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.840050 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:57 crc kubenswrapper[5094]: E0220 06:46:57.840221 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.858962 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859070 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859091 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963640 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963746 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963779 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963800 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068182 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068234 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171340 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171358 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171401 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.231855 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/0.log" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.238950 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e" exitCode=1 Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.240925 5094 scope.go:117] "RemoveContainer" containerID="0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.241352 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274907 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274949 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.279147 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066ac
f327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.310697 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.332028 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.350348 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.373606 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.377960 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.377988 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.377998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.378017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.378027 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.389698 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.406395 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.427987 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.449646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.464737 5094 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.477001 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f
416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481371 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481471 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc 
kubenswrapper[5094]: I0220 06:46:58.481551 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.492937 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.514837 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.534857 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.551742 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc 
kubenswrapper[5094]: I0220 06:46:58.584750 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584763 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584789 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688533 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688580 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688591 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688609 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688620 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791476 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791491 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.802053 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 08:52:38.563907799 +0000 UTC Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.839984 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.840076 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:58 crc kubenswrapper[5094]: E0220 06:46:58.840308 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:58 crc kubenswrapper[5094]: E0220 06:46:58.840495 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894548 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894603 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894638 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894652 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997821 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997873 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997911 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997924 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100425 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100433 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100453 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100463 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202851 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202891 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202905 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202924 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202937 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.247429 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/0.log" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.251607 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.252294 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.272200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.290398 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.307932 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.307986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.307998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.308029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.308044 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.314345 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.330597 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.354991 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.379031 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.399010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411521 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411744 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411974 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.425083 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 
06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.456494 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.471517 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.487525 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.503815 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515065 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515118 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515161 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515180 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.517084 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.529070 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.552276 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619514 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619532 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723264 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723308 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723332 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723342 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.802924 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:12:34.551280443 +0000 UTC Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825634 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825649 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.840240 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:59 crc kubenswrapper[5094]: E0220 06:46:59.840373 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929015 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929064 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929075 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929095 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929107 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031630 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031687 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031730 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031758 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031777 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.133977 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134082 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134103 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134128 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134147 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237940 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237970 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.258424 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.259289 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/0.log" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.263019 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" exitCode=1 Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.263053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.263131 5094 scope.go:117] "RemoveContainer" containerID="0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.264184 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:00 crc kubenswrapper[5094]: E0220 06:47:00.264434 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.282102 5094 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.303871 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt
\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.331279 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341240 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341268 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341286 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.358002 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.376579 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.399441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.419680 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.440635 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443876 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443905 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443919 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.459192 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.491440 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 
06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316
f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.521857 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.539161 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546556 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc 
kubenswrapper[5094]: I0220 06:47:00.546587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546607 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.556020 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.573392 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.593108 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.650543 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.650887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.651069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.651275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.651879 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755419 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755470 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755482 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755554 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755572 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.803426 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:55:54.27389845 +0000 UTC Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.841793 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:00 crc kubenswrapper[5094]: E0220 06:47:00.841994 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.841807 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:00 crc kubenswrapper[5094]: E0220 06:47:00.842596 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858494 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858519 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858529 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961489 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961507 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961551 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064167 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064226 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064264 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064281 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.132388 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f"] Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.133307 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.136200 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.136343 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.158448 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0d
d97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168379 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168428 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168487 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.175326 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.187760 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.210682 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.231988 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 
06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316
f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233318 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60f18419-2e46-4911-bceb-d8651c9fac66-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233354 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr64j\" (UniqueName: \"kubernetes.io/projected/60f18419-2e46-4911-bceb-d8651c9fac66-kube-api-access-xr64j\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.255357 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.274459 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275628 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275752 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275804 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc 
kubenswrapper[5094]: I0220 06:47:01.275832 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275859 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275852 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.281697 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:01 crc kubenswrapper[5094]: E0220 06:47:01.282018 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.289876 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.310173 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.332362 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.334895 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.335031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60f18419-2e46-4911-bceb-d8651c9fac66-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.335075 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr64j\" (UniqueName: \"kubernetes.io/projected/60f18419-2e46-4911-bceb-d8651c9fac66-kube-api-access-xr64j\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.335203 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.336357 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.336548 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.347637 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60f18419-2e46-4911-bceb-d8651c9fac66-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.351612 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.358063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr64j\" (UniqueName: \"kubernetes.io/projected/60f18419-2e46-4911-bceb-d8651c9fac66-kube-api-access-xr64j\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.368623 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379650 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379793 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379940 5094 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379961 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.391571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2
fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8
44f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.414053 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.443857 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.447141 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.459101 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"ho
st\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.477480 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.482961 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.482997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.483009 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.483028 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.483041 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.494442 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.510646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.532734 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.553775 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.572638 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586205 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586259 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586290 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.589243 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.609216 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.627179 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.649217 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.663411 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.679980 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z"
Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689114 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689193 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689217 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689232 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.695449 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.723271 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.736834 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.748904 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792596 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792792 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.803888 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 00:07:20.193741285 +0000 UTC Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.840188 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:01 crc kubenswrapper[5094]: E0220 06:47:01.840461 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.890392 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8ww4n"] Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.890917 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:01 crc kubenswrapper[5094]: E0220 06:47:01.890976 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897786 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897892 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897987 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.898057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.898991 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.910556 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.927100 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.941269 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.942333 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.942409 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhrtm\" (UniqueName: \"kubernetes.io/projected/da0aa093-1adc-45f2-a942-e68d7be23ed4-kube-api-access-mhrtm\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " 
pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.968573 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.985621 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc 
kubenswrapper[5094]: I0220 06:47:02.001291 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001422 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001520 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001738 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.016156 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.031200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.043444 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.043578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mhrtm\" (UniqueName: \"kubernetes.io/projected/da0aa093-1adc-45f2-a942-e68d7be23ed4-kube-api-access-mhrtm\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.044051 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.044378 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.54429127 +0000 UTC m=+37.416918191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.050911 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.063936 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrtm\" (UniqueName: 
\"kubernetes.io/projected/da0aa093-1adc-45f2-a942-e68d7be23ed4-kube-api-access-mhrtm\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.071294 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.091283 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.105819 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106078 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106152 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106217 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106273 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.108225 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.138571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.166028 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.186266 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.205672 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211582 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211631 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.223625 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.239981 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.256459 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.275005 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.289023 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" event={"ID":"60f18419-2e46-4911-bceb-d8651c9fac66","Type":"ContainerStarted","Data":"c8f6830864b9f0c23a91dff26fca798b670afe7c0316ae71ad386c027b8ce0bd"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.289111 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" event={"ID":"60f18419-2e46-4911-bceb-d8651c9fac66","Type":"ContainerStarted","Data":"c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.289150 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" event={"ID":"60f18419-2e46-4911-bceb-d8651c9fac66","Type":"ContainerStarted","Data":"7c24a78f0ce2423b8e774c11a286a2e80caee250864166c38a97deed16cb5641"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.291076 5094 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.307086 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314816 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314900 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314948 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.322256 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.344650 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.380059 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc 
kubenswrapper[5094]: I0220 06:47:02.405863 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddb
fd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 
06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417798 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.418055 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.428003 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.444932 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.444989 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.445008 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.445030 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.445043 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.454132 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.459211 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463816 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463854 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463895 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463915 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463927 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.470543 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z 
is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.483072 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488237 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.504805 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.506226 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510231 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510271 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510285 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510305 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510324 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.523146 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.525264 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.527978 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.528013 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.528024 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc 
kubenswrapper[5094]: I0220 06:47:02.528044 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.528057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.541352 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.541478 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.542927 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543719 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543730 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.549348 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.549562 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.549876 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:03.549617598 +0000 UTC m=+38.422244329 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.563341 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.582398 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.593347 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.608330 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.628030 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.643419 5094 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652137 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652192 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.657190 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.678369 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.700042 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.719418 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.738854 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.751579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.751832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.751923 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.751875112 +0000 UTC m=+53.624501863 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.752069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752085 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.752140 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752220 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.752185069 +0000 UTC m=+53.624811810 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.752277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752328 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752430 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752473 5094 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.752445466 +0000 UTC m=+53.625072227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752490 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752519 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752451 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752626 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.752586249 +0000 UTC m=+53.625213020 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752611 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.753116 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.753164 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.753154943 +0000 UTC m=+53.625781654 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755233 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755306 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755320 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755350 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755366 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.756935 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z 
is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.780940 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02
-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.799009 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.804409 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 05:28:51.334857578 +0000 UTC Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.813338 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.835392 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.839251 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.839332 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.839390 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.839541 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.858889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859245 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859411 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.858854 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.879237 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.892856 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963893 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963912 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963941 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963961 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067634 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067657 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067742 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171819 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.172005 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275805 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275850 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275862 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275881 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275893 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378867 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482543 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482570 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482589 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.562032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.562256 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.562379 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:05.562350194 +0000 UTC m=+40.434976935 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.585912 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.585983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.586007 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.586039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.586063 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690424 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690582 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690760 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690890 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795088 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795147 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795190 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.804811 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 08:57:49.080080649 +0000 UTC Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.839633 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.839868 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.840479 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.840604 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899274 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899343 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899374 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899435 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002403 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002434 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.105416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106025 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106258 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106537 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106630 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210603 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210679 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315126 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315194 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418774 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418818 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418861 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522581 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522668 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522769 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522793 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626207 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626226 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730635 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730654 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730685 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730737 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.805610 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:59:03.639557747 +0000 UTC Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834806 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834897 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834930 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834953 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.839115 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.839174 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:04 crc kubenswrapper[5094]: E0220 06:47:04.839437 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:04 crc kubenswrapper[5094]: E0220 06:47:04.839594 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938144 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938223 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938243 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938292 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041644 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041882 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041952 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041974 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146604 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146768 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146786 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250584 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250620 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250645 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354291 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354370 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354439 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458413 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458462 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561644 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561670 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.596602 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.596856 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.596978 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:09.596944965 +0000 UTC m=+44.469571716 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.665238 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.665609 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.665853 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.666080 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.666232 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.768907 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.768973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.768993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.769020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.769041 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.806298 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 19:02:48.86184556 +0000 UTC Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.839958 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.840186 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.840323 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.840496 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874349 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874410 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874445 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.877969 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.896969 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.918788 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.941870 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.976129 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977755 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977976 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.978040 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.000389 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.020843 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.045090 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.062145 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc 
kubenswrapper[5094]: I0220 06:47:06.081413 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083053 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083161 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083182 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.103851 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.120003 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.135387 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.157245 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.178758 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185741 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185790 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.198873 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.219868 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.288949 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc 
kubenswrapper[5094]: I0220 06:47:06.289020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.289039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.289068 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.289091 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393231 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393317 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496286 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496313 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496371 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601129 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705730 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705823 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705848 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705901 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.806775 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:19:36.548553188 +0000 UTC Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.811612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.813956 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.814266 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.814483 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.814683 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.839302 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:06 crc kubenswrapper[5094]: E0220 06:47:06.839466 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.839302 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:06 crc kubenswrapper[5094]: E0220 06:47:06.839965 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919408 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919494 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919540 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022775 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022861 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022890 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022911 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125809 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125899 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125937 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229305 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229437 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229455 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332540 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332613 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437058 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437137 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437158 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437186 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437206 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539670 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539731 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539744 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539759 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539769 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.643884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644027 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644121 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.747659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.747888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.747925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.748014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.748138 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.807845 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 09:38:00.819387497 +0000 UTC Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.839687 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.839695 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:07 crc kubenswrapper[5094]: E0220 06:47:07.840010 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:07 crc kubenswrapper[5094]: E0220 06:47:07.840079 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851568 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851678 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851814 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955596 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955767 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955793 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955821 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955844 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.058926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059099 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162326 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162427 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162440 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265518 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265575 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265629 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367812 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367887 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472410 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472493 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472540 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472562 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576401 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576465 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576537 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576560 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679640 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679781 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782549 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782567 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782685 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.808402 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 03:17:32.287849881 +0000 UTC Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.840036 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:08 crc kubenswrapper[5094]: E0220 06:47:08.840210 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.840406 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:08 crc kubenswrapper[5094]: E0220 06:47:08.840862 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886458 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886515 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886547 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886562 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990543 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990626 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990649 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990665 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093162 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093199 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093211 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.196944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197681 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197843 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302124 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302143 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302171 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302191 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405960 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405981 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510254 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510376 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510401 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.613776 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614108 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614449 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614578 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614801 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.652777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.652944 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.653345 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:17.65331649 +0000 UTC m=+52.525943241 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718872 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718968 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718985 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.808898 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:59:50.121048578 +0000 UTC Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822418 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822452 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822472 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.839487 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.839540 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.839746 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.839910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926683 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926844 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030429 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030476 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030501 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134552 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134628 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.238211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.238827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.239021 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.239201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.239367 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343260 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343293 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446525 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446573 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446591 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446653 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446671 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551011 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551398 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551862 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.552100 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656199 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.759810 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760341 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760773 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.810739 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 21:12:01.892840054 +0000 UTC Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.839332 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.839372 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:10 crc kubenswrapper[5094]: E0220 06:47:10.839529 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:10 crc kubenswrapper[5094]: E0220 06:47:10.839678 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864670 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864820 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864847 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864904 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968567 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968765 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072432 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072546 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072606 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.176636 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.176699 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.177000 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.177038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.177349 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.282201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.282585 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.282852 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.283056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.283236 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387064 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387151 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387199 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.490368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.490923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.491078 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.491258 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.491402 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595497 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595586 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595606 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595642 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595672 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.700466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701156 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701224 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701302 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804313 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804443 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.811514 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 01:20:13.008437076 +0000 UTC Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.840231 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.840393 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:11 crc kubenswrapper[5094]: E0220 06:47:11.840997 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.841081 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:11 crc kubenswrapper[5094]: E0220 06:47:11.841190 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908473 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908490 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908516 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908535 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013315 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013340 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116777 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116885 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221279 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221411 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221432 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325388 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325433 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.334262 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.338521 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.339405 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.364801 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.390644 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.411755 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437360 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437431 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437470 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.438887 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.463801 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.483460 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc 
kubenswrapper[5094]: I0220 06:47:12.502808 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddb
fd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 
06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.519553 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.530346 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539805 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539846 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539859 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc 
kubenswrapper[5094]: I0220 06:47:12.539879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539891 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.543568 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.557478 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.571083 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.584373 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.612670 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642891 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642905 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642938 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.645045 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.658353 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.669984 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.747749 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748301 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.811912 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 14:58:53.881092253 +0000 UTC Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.840214 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.840274 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.840455 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.840609 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852059 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852114 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852180 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858325 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858378 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858405 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858424 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.880832 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886748 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886786 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886812 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.910269 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917351 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917506 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.936353 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943202 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943253 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943285 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.960860 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967079 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967158 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967192 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.983043 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.983406 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.985617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.985873 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.986047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.986391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.986692 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091634 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195418 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195516 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195578 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.299955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300030 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300071 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.347556 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.349073 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.354478 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" exitCode=1 Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.354546 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.354624 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.355826 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:13 crc kubenswrapper[5094]: E0220 06:47:13.356159 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.379129 5094 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.401604 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.404983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405099 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405178 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405313 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.418693 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.438274 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.463530 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.480769 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.504324 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509255 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.526602 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.549811 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.570792 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.600615 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.625984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626112 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.649727 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.673110 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.693829 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.717838 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316
f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729031 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729267 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729379 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729468 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.734456 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.746097 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.813203 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:41:31.611712625 +0000 UTC Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.833528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc 
kubenswrapper[5094]: I0220 06:47:13.833668 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.833837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.833969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.834039 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.840047 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:13 crc kubenswrapper[5094]: E0220 06:47:13.840257 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.840088 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:13 crc kubenswrapper[5094]: E0220 06:47:13.840499 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937900 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937966 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042373 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042395 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146263 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146343 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146412 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249379 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249450 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249468 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249498 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249516 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353276 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353373 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353426 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.361544 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.368567 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:14 crc kubenswrapper[5094]: E0220 06:47:14.368890 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.390862 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.408506 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.435081 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.455422 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456349 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456366 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456411 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.468449 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc 
kubenswrapper[5094]: I0220 06:47:14.485810 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.506873 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.539201 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.559938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560065 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560131 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.562801 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z 
is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.588549 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.608847 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.629521 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.663858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664197 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc 
kubenswrapper[5094]: I0220 06:47:14.663944 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664296 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664507 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.703142 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.724411 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.746259 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768043 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768114 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768136 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768186 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.769617 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.813585 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 04:34:20.82517273 +0000 UTC Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.839588 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.839697 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:14 crc kubenswrapper[5094]: E0220 06:47:14.839813 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:14 crc kubenswrapper[5094]: E0220 06:47:14.839910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.871539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872006 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872520 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976643 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976690 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080378 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080545 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.184369 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.184875 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.185060 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.185226 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.185381 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289639 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289753 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393490 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393564 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393584 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393636 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.504783 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.505313 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.505823 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.506295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.506656 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.610901 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.610992 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.611017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.611082 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.611109 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715409 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715428 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.814929 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 04:53:42.292504849 +0000 UTC Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819261 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819327 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819346 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819397 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.839300 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.839372 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:15 crc kubenswrapper[5094]: E0220 06:47:15.839585 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:15 crc kubenswrapper[5094]: E0220 06:47:15.839791 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.868510 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.892812 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.912689 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923505 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923526 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923555 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923573 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.936619 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.971625 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.995251 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.015559 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.026913 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027252 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027477 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027696 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027919 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.032608 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 
20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.056195 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.079239 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.093018 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.094779 5094 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.105934 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.112192 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df51
0e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.131521 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131548 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.134374 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81
7c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.161219 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.180657 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.199339 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.230614 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235197 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.235235 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235312 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235334 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.254737 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.280049 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:
45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741
084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.304334 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.325683 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.339948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.340033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.340056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.340087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.340105 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.363417 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.402098 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.423068 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443919 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443940 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.443974 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443993 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.484825 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.509550 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.531040 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550731 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.551024 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.551328 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670afe7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.569157 5094 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.595034 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\
\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\
\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.614964 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.634426 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656449 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656774 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656938 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.657644 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.679371 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.699010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760505 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.815373 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:38:06.980224335 +0000 UTC Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.840075 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:16 crc kubenswrapper[5094]: E0220 06:47:16.840320 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.840632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:16 crc kubenswrapper[5094]: E0220 06:47:16.841034 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.863984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.864491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.864750 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.864966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.865118 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.968527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969244 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969615 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.072618 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073106 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073584 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177802 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177820 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177867 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281767 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281848 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281874 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281892 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384847 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384984 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488694 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488800 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488857 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488887 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592423 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592498 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592525 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592548 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.663190 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.663453 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.663588 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:33.663559149 +0000 UTC m=+68.536185870 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695499 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695572 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695599 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695620 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799388 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799488 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799549 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.816073 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 19:01:02.41531012 +0000 UTC Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.839479 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.839557 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.839681 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.839975 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903408 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006609 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006684 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006757 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110644 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213895 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213919 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317550 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421822 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421912 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421940 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421978 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.422001 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.525943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526070 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526248 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630403 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630424 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.733965 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734043 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734099 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734122 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779118 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779315 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779369 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.779329178 +0000 UTC m=+85.651955929 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779488 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779617 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.779579394 +0000 UTC m=+85.652206145 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779683 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779755 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779778 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779804 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779674 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779871 5094 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779896 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779828 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779839 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.779823351 +0000 UTC m=+85.652450092 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.780091 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.780071947 +0000 UTC m=+85.652698688 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.780138 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.780123318 +0000 UTC m=+85.652750059 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.817281 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 07:01:30.418132918 +0000 UTC Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837645 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837750 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837771 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837804 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837826 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.839865 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.839924 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.840083 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.840289 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942086 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942146 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942165 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045758 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045822 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045949 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.149960 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150035 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150103 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.253944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254117 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357831 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357980 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461825 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461952 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461971 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566212 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566269 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566291 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670621 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670664 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670683 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774601 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774685 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774770 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.818406 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 08:28:36.466273603 +0000 UTC
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.839986 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.840071 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:47:19 crc kubenswrapper[5094]: E0220 06:47:19.840217 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:47:19 crc kubenswrapper[5094]: E0220 06:47:19.840382 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.885739 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.885861 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.885889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.886653 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.886748 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990742 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990763 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990814 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095376 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095474 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095531 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199501 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199629 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199653 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303285 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303409 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303430 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406436 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406531 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406576 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509369 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509439 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509458 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509486 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509506 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.612917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.612994 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.613012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.613043 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.613063 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716651 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716682 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716739 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.818571 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 12:35:13.106632418 +0000 UTC
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820556 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820615 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820636 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.839144 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.839238 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:47:20 crc kubenswrapper[5094]: E0220 06:47:20.839363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:47:20 crc kubenswrapper[5094]: E0220 06:47:20.839488 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924026 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924086 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924105 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924136 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924154 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.027946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028048 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028154 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131424 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131473 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131517 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131536 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233853 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233935 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233954 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233985 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.234005 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337765 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337908 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441419 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441588 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441621 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.545882 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.545958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.545978 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.546008 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.546035 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.649981 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650104 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650123 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754150 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754174 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754196 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.818799 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 09:21:07.278432493 +0000 UTC
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.839757 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.839929 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:21 crc kubenswrapper[5094]: E0220 06:47:21.840170 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:47:21 crc kubenswrapper[5094]: E0220 06:47:21.840492 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857095 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857146 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857163 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857176 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960194 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063156 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063684 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063743 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063776 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063798 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168488 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271469 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271541 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271588 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271610 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375429 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375520 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478595 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478626 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478646 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.581921 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.581999 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.582022 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.582052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.582073 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686251 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686341 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686356 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.789914 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.789984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.790004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.790036 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.790057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.819312 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 04:24:52.401242861 +0000 UTC Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.839983 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.840043 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:22 crc kubenswrapper[5094]: E0220 06:47:22.840184 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:22 crc kubenswrapper[5094]: E0220 06:47:22.840384 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893529 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893582 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996698 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996785 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101123 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101176 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101218 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101237 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104586 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104630 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.127889 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.133945 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134154 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.157447 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163526 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163544 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163593 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.187067 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192683 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192769 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192787 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192811 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192828 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.214230 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220524 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.242999 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.243274 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245642 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245685 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349604 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349678 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453818 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453954 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453973 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557435 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557552 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557637 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.662456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.662858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.663071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.663282 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.663442 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768799 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768898 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768918 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768950 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768974 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.820342 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 01:27:12.749824603 +0000 UTC
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.841135 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.841219 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.841403 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.841660 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.872981 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873448 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873593 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873788 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.976963 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977018 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977036 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977086 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.080929 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.080998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.081017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.081044 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.081065 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184448 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184537 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184590 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184611 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288165 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288529 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288671 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392549 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392688 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496525 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496635 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496674 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496699 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.600465 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.600867 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.601019 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.601166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.601306 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.705916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.705985 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.706006 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.706037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.706057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809432 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809453 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809504 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.821674 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 14:21:51.538317472 +0000 UTC
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.840186 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.840372 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:47:24 crc kubenswrapper[5094]: E0220 06:47:24.840542 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:47:24 crc kubenswrapper[5094]: E0220 06:47:24.840837 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.913741 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.914923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.915021 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.915065 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.915091 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018309 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018881 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018966 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.123917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124147 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124223 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227439 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227531 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227551 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227594 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331171 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331184 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.434914 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435169 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435226 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539016 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539115 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539201 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642817 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642951 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642966 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746482 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746576 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746594 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.822806 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 15:22:52.998217742 +0000 UTC Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.839491 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:25 crc kubenswrapper[5094]: E0220 06:47:25.839764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.839870 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:25 crc kubenswrapper[5094]: E0220 06:47:25.840122 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.852327 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.854472 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.854665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.854943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.855153 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.859879 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc 
kubenswrapper[5094]: I0220 06:47:25.880477 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.902383 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.924087 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.940232 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.955877 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959380 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959423 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959488 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.981928 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.006174 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.028069 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.046800 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.063888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.063991 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.064052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc 
kubenswrapper[5094]: I0220 06:47:26.064132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.064239 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.070305 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.103954 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\
\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.126831 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.148483 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172237 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172261 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172331 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.176985 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.213646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.236441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.255540 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.275903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.275971 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.275991 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.276025 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.276048 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380341 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380517 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380543 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484117 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484210 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484225 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587078 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587209 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587230 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691366 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691439 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691507 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.795666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796265 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796605 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796817 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796991 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.823513 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 04:32:30.676341913 +0000 UTC Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.839565 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.839569 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:26 crc kubenswrapper[5094]: E0220 06:47:26.839757 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:26 crc kubenswrapper[5094]: E0220 06:47:26.839933 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904031 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904184 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008193 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112118 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112236 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215209 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215353 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318663 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318682 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318755 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318784 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422483 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422536 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526191 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526219 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526283 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.629968 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630042 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630062 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630112 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733414 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733529 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733550 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.823914 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 10:46:39.093311238 +0000 UTC Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836757 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836848 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836898 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.840329 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.840557 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:27 crc kubenswrapper[5094]: E0220 06:47:27.840846 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:27 crc kubenswrapper[5094]: E0220 06:47:27.840988 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941289 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941404 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941423 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.044939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045025 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045127 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148546 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148573 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148638 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252009 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252083 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252098 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355544 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355565 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355595 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355615 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459290 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459410 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563268 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563286 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.666973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667085 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667111 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667151 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667175 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770754 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770775 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770808 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770828 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.825108 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 19:24:25.213094099 +0000 UTC
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.839573 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.839694 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:47:28 crc kubenswrapper[5094]: E0220 06:47:28.840518 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:47:28 crc kubenswrapper[5094]: E0220 06:47:28.840626 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.841282 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca"
Feb 20 06:47:28 crc kubenswrapper[5094]: E0220 06:47:28.841884 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875301 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875346 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875364 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.977915 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.977979 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.977993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.978014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.978028 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081460 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081505 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081525 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184949 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184962 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287718 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287802 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287813 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287838 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391195 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391269 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391297 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391317 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495302 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495413 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495478 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599471 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599563 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.703925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704032 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704104 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808129 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808176 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808205 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808217 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.825362 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 01:26:33.106320594 +0000 UTC
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.839808 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.839850 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:47:29 crc kubenswrapper[5094]: E0220 06:47:29.840078 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:47:29 crc kubenswrapper[5094]: E0220 06:47:29.840283 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911404 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911426 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911475 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014552 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014629 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014766 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118123 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118162 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118187 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221425 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221445 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221498 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327339 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327486 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327506 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431386 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431413 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.534908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535153 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535348 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.637956 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638006 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638024 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638049 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638068 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741138 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741155 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741197 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.825468 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 14:48:16.592112178 +0000 UTC
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.839476 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:47:30 crc kubenswrapper[5094]: E0220 06:47:30.839637 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.839652 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:47:30 crc kubenswrapper[5094]: E0220 06:47:30.839790 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844663 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844791 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844857 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844871 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844971 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948335 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948415 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948658 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051380 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051604 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051734 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.154855 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155207 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155272 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155337 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257627 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257638 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257661 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360482 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360625 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463578 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463868 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566622 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566636 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669954 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669999 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.670015 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.772932 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.772986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.773001 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.773020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.773032 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.826254 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 16:04:39.538665497 +0000 UTC Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.839632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.839883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:31 crc kubenswrapper[5094]: E0220 06:47:31.839894 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:31 crc kubenswrapper[5094]: E0220 06:47:31.840417 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.874951 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.874991 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.875002 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.875021 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.875037 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.977993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978070 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978085 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978131 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081229 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081282 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081308 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081318 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184260 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184296 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286575 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286803 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.287016 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391766 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391795 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391851 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.494909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.494966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.494986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.495012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.495028 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598670 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598979 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702628 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702681 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702713 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702725 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805734 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805762 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805772 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805785 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805793 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.827206 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 16:21:05.073485004 +0000 UTC Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.839551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:32 crc kubenswrapper[5094]: E0220 06:47:32.839744 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.839859 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:32 crc kubenswrapper[5094]: E0220 06:47:32.839990 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910259 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910282 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910312 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910334 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013220 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013320 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013342 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013397 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116867 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116934 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220572 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220598 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.321942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322032 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322051 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322081 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322102 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.344487 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351825 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351878 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351895 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351941 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.374642 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380235 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380306 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380330 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380359 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380380 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.400838 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407557 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407576 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407602 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407621 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.428120 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432647 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432768 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432796 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432814 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.455944 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.456168 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458318 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458371 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458412 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458429 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561434 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561444 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561476 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664937 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664959 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.679760 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.679967 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.680042 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:05.680024617 +0000 UTC m=+100.552651328 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768146 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.828282 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 06:03:06.919654293 +0000 UTC Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.839849 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.839939 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.840112 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.840369 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870872 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870921 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870937 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870952 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972630 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972644 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972655 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075245 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075330 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075393 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178304 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178323 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178385 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281527 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.384971 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385058 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385105 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385124 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488522 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488581 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591453 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591513 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591531 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591556 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591576 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694235 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694293 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694305 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694339 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797686 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797730 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.829190 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 04:20:05.561705888 +0000 UTC Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.839947 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:34 crc kubenswrapper[5094]: E0220 06:47:34.840229 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.840755 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:34 crc kubenswrapper[5094]: E0220 06:47:34.840912 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903599 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903687 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903751 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903972 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.904003 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008186 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008210 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111330 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111430 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215367 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215442 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318499 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318548 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318585 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422169 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422257 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524581 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524629 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524649 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524661 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.627997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628050 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628061 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628088 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731274 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731326 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731344 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731367 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731386 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.829689 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 12:04:55.607346032 +0000 UTC
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834152 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834163 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.839654 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:47:35 crc kubenswrapper[5094]: E0220 06:47:35.839829 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.840177 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:35 crc kubenswrapper[5094]: E0220 06:47:35.840291 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.853933 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.866243 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.884922 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.900517 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.919094 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a
933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937115 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc 
kubenswrapper[5094]: I0220 06:47:35.937195 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.940930 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.972344 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.989167 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.013167 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.035676 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:
45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741
084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039223 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039532 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.063525 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.080873 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.109516 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.132641 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.143623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.143987 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.144112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.144589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.144695 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.150945 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.164842 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.182255 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.199456 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.249935 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.249984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.249998 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.250019 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.250033 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353661 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353674 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353724 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.458950 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459102 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459123 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462398 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/0.log" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462448 5094 generic.go:334] "Generic (PLEG): container finished" podID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" containerID="7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118" exitCode=1 Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462479 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerDied","Data":"7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462928 5094 scope.go:117] "RemoveContainer" containerID="7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.491594 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.508484 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.524407 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.547932 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.559394 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561834 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561973 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.571953 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573
d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.587985 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.604083 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.618487 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.634081 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.654722 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664542 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc 
kubenswrapper[5094]: I0220 06:47:36.664561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664572 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.674178 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.686134 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.704589 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.737446 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.756761 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770463 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770508 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc 
kubenswrapper[5094]: I0220 06:47:36.770727 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770742 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.782468 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.799270 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.830443 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 14:42:49.465754773 +0000 UTC Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.839868 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:36 crc kubenswrapper[5094]: E0220 06:47:36.839997 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.840072 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:36 crc kubenswrapper[5094]: E0220 06:47:36.840118 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.872948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.872977 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.872986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.873004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.873016 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975154 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975208 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975239 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077876 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077964 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077976 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181091 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181101 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181136 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284859 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284921 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284968 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284987 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388653 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388757 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388804 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.469144 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/0.log" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.469534 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.488637 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.491945 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492035 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492144 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.506131 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.520983 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.537450 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.550445 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.569840 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594876 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594886 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594913 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.595411 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.629391 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.646475 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.664515 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.678200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.692252 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700426 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700507 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.706313 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.719183 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.739060 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.753589 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.769528 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a
933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.786563 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804042 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804098 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804149 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.830656 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:01:48.293373748 +0000 UTC Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.839977 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.840041 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:37 crc kubenswrapper[5094]: E0220 06:47:37.840130 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:37 crc kubenswrapper[5094]: E0220 06:47:37.840391 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907758 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907960 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.010931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.010994 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.011011 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.011034 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.011050 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114722 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114759 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114775 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218237 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218265 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218277 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320374 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320433 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.423910 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.423969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.423983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.424005 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.424024 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527299 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630517 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630594 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630614 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630678 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734340 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734393 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.831183 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 14:39:37.948913495 +0000 UTC Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838099 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838172 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.839137 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.839245 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:38 crc kubenswrapper[5094]: E0220 06:47:38.839496 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:38 crc kubenswrapper[5094]: E0220 06:47:38.839663 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942185 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942208 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942262 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.044973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045082 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045134 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148764 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148838 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148859 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148907 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252677 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252771 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252844 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355868 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355970 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355990 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459120 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459189 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.562440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.562926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.563143 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.563314 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.563449 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667694 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667759 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667798 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667821 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772526 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772554 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772578 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.831449 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 13:00:41.577906335 +0000 UTC Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.839891 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.839929 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:39 crc kubenswrapper[5094]: E0220 06:47:39.840117 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:39 crc kubenswrapper[5094]: E0220 06:47:39.840272 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875337 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875359 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.978943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.978993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.979010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.979035 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.979054 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.082785 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083308 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083621 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083785 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188213 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188875 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188973 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.292941 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293018 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293036 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293067 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293087 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396149 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396244 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507771 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507844 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507860 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507923 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611567 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611764 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611793 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715551 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715639 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715737 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715765 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819395 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819464 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.833121 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:29:48.224222968 +0000 UTC Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.839478 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.839568 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:40 crc kubenswrapper[5094]: E0220 06:47:40.839688 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:40 crc kubenswrapper[5094]: E0220 06:47:40.839828 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923474 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923601 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027408 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027497 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027547 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131173 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131252 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131270 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131302 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131322 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235359 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235486 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340518 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340590 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446541 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446582 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549691 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549808 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549863 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549887 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654104 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654174 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654197 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654229 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757487 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757576 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.833473 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:14:39.615440174 +0000 UTC Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.839963 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.839960 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:41 crc kubenswrapper[5094]: E0220 06:47:41.840160 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:41 crc kubenswrapper[5094]: E0220 06:47:41.840392 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.858746 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860400 5094 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963068 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963086 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963133 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067101 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067153 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067172 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170415 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274107 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274156 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378215 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378233 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481219 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481268 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481286 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481311 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481332 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584927 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584970 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584989 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688143 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688173 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688191 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791669 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791812 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791861 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.834225 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 14:10:08.856586607 +0000 UTC Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.839695 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:42 crc kubenswrapper[5094]: E0220 06:47:42.839916 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.839695 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:42 crc kubenswrapper[5094]: E0220 06:47:42.840132 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895335 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895478 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:42.999445 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103153 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103211 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206181 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206312 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206335 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310402 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310428 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414667 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414686 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414767 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518476 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518538 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.621903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.621972 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.621990 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.622015 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.622036 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725787 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725870 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725890 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725936 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.733871 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.733966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.733996 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.734029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.734067 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.758386 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764474 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764499 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764517 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.784437 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790184 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790292 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790321 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790358 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790381 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.812321 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816835 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.817037 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.834668 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 07:51:08.72345156 +0000 UTC Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.838424 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",
\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.839474 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.839792 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.840006 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.840179 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.840456 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843892 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843953 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.867229 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.867493 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871103 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871151 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871169 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974656 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974676 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974739 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974760 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078737 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078779 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078810 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078823 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182798 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182871 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182919 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182938 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.289946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290088 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290120 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.392939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393001 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393077 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496244 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.511926 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.517189 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.518138 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.545318 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.563681 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.598840 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602000 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602192 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602214 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.633010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.668427 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.687951 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.701034 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705290 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705308 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705353 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.715389 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is 
after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.727917 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.742264 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d
6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.757153 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.770507 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc 
kubenswrapper[5094]: I0220 06:47:44.785595 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.802584 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.807918 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.807997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.808016 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.808087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.808113 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.819384 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.833520 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.835616 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 00:49:16.600586907 +0000 UTC Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.839742 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.839786 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:44 crc kubenswrapper[5094]: E0220 06:47:44.839945 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:44 crc kubenswrapper[5094]: E0220 06:47:44.840078 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.853241 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.868860 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.889480 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912059 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912266 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015401 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015419 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015465 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119640 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119746 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119776 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223318 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223369 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327732 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327754 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327799 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431024 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431123 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431144 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431156 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.525497 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.526888 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.531584 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" exitCode=1 Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.531648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.531755 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.533467 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:47:45 crc kubenswrapper[5094]: E0220 06:47:45.533766 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534066 5094 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534098 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534128 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534141 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.553080 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.577148 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.599956 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.620369 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.637943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.637997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.638010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc 
kubenswrapper[5094]: I0220 06:47:45.638037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.638058 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.643007 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.688051 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.1
1\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.708397 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.729604 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.741830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.741936 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.741965 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.742003 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.742029 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.752756 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.786574 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:44Z\\\",\\\"message\\\":\\\"work controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z]\\\\nI0220 06:47:44.923774 7201 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f\\\\nI0220 06:47:44.923771 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-9vd4p\\\\nI0220 06:47:44.923661 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0220 06:47:44.923695 7201 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/
net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.810047 5094 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.829553 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.836666 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 04:55:16.376232155 +0000 UTC Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.839649 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.839779 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:45 crc kubenswrapper[5094]: E0220 06:47:45.839881 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:45 crc kubenswrapper[5094]: E0220 06:47:45.839987 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846190 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846287 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846320 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846343 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.855665 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573
d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.879064 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.907329 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.929875 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.948174 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950564 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950586 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.973893 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.995270 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.028159 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b938
9340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.046356 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055045 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055222 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.066852 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.087444 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.117178 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:44Z\\\",\\\"message\\\":\\\"work controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z]\\\\nI0220 06:47:44.923774 7201 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f\\\\nI0220 06:47:44.923771 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-9vd4p\\\\nI0220 06:47:44.923661 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0220 06:47:44.923695 7201 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/
net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.137504 5094 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.152904 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159319 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159356 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159379 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.172357 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 
20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.195645 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.220550 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.241834 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.260199 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262023 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262105 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262125 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262162 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262182 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.278565 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.304655 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.325618 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.350317 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372371 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372422 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372589 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.389365 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.409547 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475417 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475459 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.538514 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.545080 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:47:46 crc kubenswrapper[5094]: E0220 06:47:46.545357 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.567777 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579322 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579505 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579737 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.580153 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.584383 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.601306 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.622124 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.638463 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.658061 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a
933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.677211 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682900 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682996 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.683015 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.697541 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.716337 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.739441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.757020 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.779690 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786582 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.800419 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.819524 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.837291 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 20:44:45.665010916 +0000 UTC Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.839746 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.839787 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:46 crc kubenswrapper[5094]: E0220 06:47:46.839986 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:46 crc kubenswrapper[5094]: E0220 06:47:46.840100 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.855619 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:44Z\\\",\\\"message\\\":\\\"work controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z]\\\\nI0220 06:47:44.923774 7201 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f\\\\nI0220 06:47:44.923771 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-9vd4p\\\\nI0220 06:47:44.923661 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0220 06:47:44.923695 7201 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.895425 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.897967 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898081 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898140 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.920505 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.942364 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.959901 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002740 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002791 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106999 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.107027 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211237 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211259 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211306 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315274 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315350 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315423 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420555 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420744 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420766 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524508 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524651 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524681 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524772 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524805 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628783 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628804 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628851 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628870 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731254 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731271 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731284 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834486 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834541 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834570 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.837696 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 20:26:40.72872953 +0000 UTC Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.840009 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:47 crc kubenswrapper[5094]: E0220 06:47:47.840181 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.840937 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:47 crc kubenswrapper[5094]: E0220 06:47:47.841049 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937897 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937980 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937995 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.041592 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.041991 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.042120 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.042255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.042376 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145588 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145605 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145633 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145654 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248594 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248657 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248728 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352424 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352519 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352540 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352555 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456565 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456637 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560298 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560318 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560367 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664142 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664163 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664218 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664237 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.767672 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768250 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768458 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768630 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768820 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.838292 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:01:22.379918536 +0000 UTC Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.839257 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.839257 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:48 crc kubenswrapper[5094]: E0220 06:47:48.839475 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:48 crc kubenswrapper[5094]: E0220 06:47:48.839582 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873154 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873240 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873266 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873288 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977224 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977315 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977370 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977395 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088283 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088414 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088517 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191811 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191898 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191918 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191968 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295111 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295130 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295158 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295181 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399643 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399663 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503463 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503511 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503531 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607207 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607230 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.710598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711111 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711306 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711658 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711867 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814982 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.815006 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.839530 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 08:26:42.681831623 +0000 UTC Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.839856 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.839850 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:49 crc kubenswrapper[5094]: E0220 06:47:49.840051 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:49 crc kubenswrapper[5094]: E0220 06:47:49.840268 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922778 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922878 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922941 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922972 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027003 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027076 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027098 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027128 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027151 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130679 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130782 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130802 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130833 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130853 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234323 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234373 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234392 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337965 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337985 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337995 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440411 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440546 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544292 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544366 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544425 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544453 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647210 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647219 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647236 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647247 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751130 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751191 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751209 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751234 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.839737 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:28:24.201652965 +0000 UTC Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.839868 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.839921 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.840153 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.840288 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855258 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855311 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855333 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855360 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855380 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880204 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880176045 +0000 UTC m=+149.752802796 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880255 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880314 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880391 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880431 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880518 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880542 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880553 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880573 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880616 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880597495 +0000 UTC m=+149.753224246 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880618 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880642 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880629676 +0000 UTC m=+149.753256427 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880688 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880769 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880791 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880812 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880769949 +0000 UTC m=+149.753396690 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880880 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880845962 +0000 UTC m=+149.753472873 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958626 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958645 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062501 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062659 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166569 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270508 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270577 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374591 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374614 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.478947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479027 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479090 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587233 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587367 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690240 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690304 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690323 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690349 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690369 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.793673 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794176 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794193 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794233 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.840008 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 19:30:00.332389212 +0000 UTC Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.840213 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:51 crc kubenswrapper[5094]: E0220 06:47:51.840448 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.840579 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:51 crc kubenswrapper[5094]: E0220 06:47:51.840960 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.896950 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897090 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000742 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000806 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000831 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105578 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105595 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208793 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208836 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208860 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208870 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313286 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313335 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313355 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417051 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417196 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.520916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.520995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.521013 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.521049 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.521073 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623952 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.624017 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727407 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727472 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727493 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727521 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727539 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830795 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830812 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830860 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.839297 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.839348 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:52 crc kubenswrapper[5094]: E0220 06:47:52.839649 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:52 crc kubenswrapper[5094]: E0220 06:47:52.839787 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.840234 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 23:28:45.883416792 +0000 UTC Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934785 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934882 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934900 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038575 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038646 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038668 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038742 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038762 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141544 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141631 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141658 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141677 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245351 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245362 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245402 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.349869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.349955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.349979 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.350011 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.350035 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453276 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453364 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453383 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453432 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453457 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556431 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556483 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556493 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556513 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556525 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660502 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660557 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763691 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763870 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763904 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763969 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.839796 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:53 crc kubenswrapper[5094]: E0220 06:47:53.839991 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.840115 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:53 crc kubenswrapper[5094]: E0220 06:47:53.840290 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.840362 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:57:48.951556214 +0000 UTC Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867159 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867190 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969892 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969934 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969949 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969959 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030199 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030264 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:54Z","lastTransitionTime":"2026-02-20T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.096919 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx"] Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.097370 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.099803 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.100717 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.101366 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.102784 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.133318 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8wch6" podStartSLOduration=68.133286233 podStartE2EDuration="1m8.133286233s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.118670336 +0000 UTC m=+88.991297087" watchObservedRunningTime="2026-02-20 06:47:54.133286233 +0000 UTC m=+89.005912974" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.133537 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qzxk2" podStartSLOduration=68.13352935 podStartE2EDuration="1m8.13352935s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.133090298 +0000 UTC m=+89.005717009" watchObservedRunningTime="2026-02-20 06:47:54.13352935 +0000 UTC m=+89.006156091" Feb 20 06:47:54 crc 
kubenswrapper[5094]: I0220 06:47:54.162195 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" podStartSLOduration=67.162173001 podStartE2EDuration="1m7.162173001s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.161863034 +0000 UTC m=+89.034489755" watchObservedRunningTime="2026-02-20 06:47:54.162173001 +0000 UTC m=+89.034799752" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.200472 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.200443572 podStartE2EDuration="38.200443572s" podCreationTimestamp="2026-02-20 06:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.200227507 +0000 UTC m=+89.072854278" watchObservedRunningTime="2026-02-20 06:47:54.200443572 +0000 UTC m=+89.073070293" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.219969 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83799ff-1272-441c-87ec-74034bf3622c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220034 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83799ff-1272-441c-87ec-74034bf3622c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220337 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220425 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e83799ff-1272-441c-87ec-74034bf3622c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220476 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.257967 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podStartSLOduration=67.257930301 podStartE2EDuration="1m7.257930301s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.25707986 +0000 UTC m=+89.129706611" watchObservedRunningTime="2026-02-20 
06:47:54.257930301 +0000 UTC m=+89.130557052" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.294626 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zr8rz" podStartSLOduration=67.294598754 podStartE2EDuration="1m7.294598754s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.294102901 +0000 UTC m=+89.166729632" watchObservedRunningTime="2026-02-20 06:47:54.294598754 +0000 UTC m=+89.167225475" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.321800 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e83799ff-1272-441c-87ec-74034bf3622c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322280 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83799ff-1272-441c-87ec-74034bf3622c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322401 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83799ff-1272-441c-87ec-74034bf3622c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322551 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322290 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.323517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83799ff-1272-441c-87ec-74034bf3622c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.329890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83799ff-1272-441c-87ec-74034bf3622c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.330190 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.33016565 podStartE2EDuration="13.33016565s" podCreationTimestamp="2026-02-20 06:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.329214697 +0000 UTC m=+89.201841418" watchObservedRunningTime="2026-02-20 06:47:54.33016565 +0000 UTC m=+89.202792371" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.347063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e83799ff-1272-441c-87ec-74034bf3622c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.362974 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.362950531 podStartE2EDuration="1m9.362950531s" podCreationTimestamp="2026-02-20 06:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.362832377 +0000 UTC m=+89.235459088" 
watchObservedRunningTime="2026-02-20 06:47:54.362950531 +0000 UTC m=+89.235577242" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.404399 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.404375487 podStartE2EDuration="1m7.404375487s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.383053359 +0000 UTC m=+89.255680070" watchObservedRunningTime="2026-02-20 06:47:54.404375487 +0000 UTC m=+89.277002198" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.419061 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.466097 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.466071315 podStartE2EDuration="1m7.466071315s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.464816695 +0000 UTC m=+89.337443406" watchObservedRunningTime="2026-02-20 06:47:54.466071315 +0000 UTC m=+89.338698026" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.542247 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" podStartSLOduration=67.542224688 podStartE2EDuration="1m7.542224688s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.527123119 +0000 UTC m=+89.399749830" watchObservedRunningTime="2026-02-20 
06:47:54.542224688 +0000 UTC m=+89.414851399" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.579505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" event={"ID":"e83799ff-1272-441c-87ec-74034bf3622c","Type":"ContainerStarted","Data":"d051043d04fa6ae5988a43bf0012be5faab49c0b8d99c50d97f7d08c282a75d6"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.579577 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" event={"ID":"e83799ff-1272-441c-87ec-74034bf3622c","Type":"ContainerStarted","Data":"9dbcd7bdd8f844492425ae1e0ca03888fde22dfbcb776a0a71b53da50e068ccc"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.595887 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" podStartSLOduration=67.595860275 podStartE2EDuration="1m7.595860275s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.595685851 +0000 UTC m=+89.468312602" watchObservedRunningTime="2026-02-20 06:47:54.595860275 +0000 UTC m=+89.468487006" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.839671 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.839671 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:54 crc kubenswrapper[5094]: E0220 06:47:54.839930 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:54 crc kubenswrapper[5094]: E0220 06:47:54.840108 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.840786 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 06:19:34.876721117 +0000 UTC Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.840879 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.853118 5094 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 06:47:55 crc kubenswrapper[5094]: I0220 06:47:55.839243 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:55 crc kubenswrapper[5094]: I0220 06:47:55.839469 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:55 crc kubenswrapper[5094]: E0220 06:47:55.841296 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:55 crc kubenswrapper[5094]: E0220 06:47:55.841686 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:56 crc kubenswrapper[5094]: I0220 06:47:56.839430 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:56 crc kubenswrapper[5094]: I0220 06:47:56.839482 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:56 crc kubenswrapper[5094]: E0220 06:47:56.839658 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:56 crc kubenswrapper[5094]: E0220 06:47:56.839836 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:57 crc kubenswrapper[5094]: I0220 06:47:57.843081 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:57 crc kubenswrapper[5094]: E0220 06:47:57.843331 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:57 crc kubenswrapper[5094]: I0220 06:47:57.843602 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:57 crc kubenswrapper[5094]: E0220 06:47:57.843801 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:58 crc kubenswrapper[5094]: I0220 06:47:58.839319 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:58 crc kubenswrapper[5094]: I0220 06:47:58.839496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:58 crc kubenswrapper[5094]: E0220 06:47:58.840084 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:58 crc kubenswrapper[5094]: E0220 06:47:58.840401 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:59 crc kubenswrapper[5094]: I0220 06:47:59.840118 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:59 crc kubenswrapper[5094]: I0220 06:47:59.840248 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:59 crc kubenswrapper[5094]: E0220 06:47:59.840357 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:59 crc kubenswrapper[5094]: E0220 06:47:59.841145 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:59 crc kubenswrapper[5094]: I0220 06:47:59.841762 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:47:59 crc kubenswrapper[5094]: E0220 06:47:59.842046 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:48:00 crc kubenswrapper[5094]: I0220 06:48:00.839326 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:00 crc kubenswrapper[5094]: I0220 06:48:00.839342 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:00 crc kubenswrapper[5094]: E0220 06:48:00.840032 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:00 crc kubenswrapper[5094]: E0220 06:48:00.840171 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:01 crc kubenswrapper[5094]: I0220 06:48:01.839968 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:01 crc kubenswrapper[5094]: E0220 06:48:01.840138 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:01 crc kubenswrapper[5094]: I0220 06:48:01.842507 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:01 crc kubenswrapper[5094]: E0220 06:48:01.842602 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:02 crc kubenswrapper[5094]: I0220 06:48:02.839566 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:02 crc kubenswrapper[5094]: I0220 06:48:02.839671 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:02 crc kubenswrapper[5094]: E0220 06:48:02.839977 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:02 crc kubenswrapper[5094]: E0220 06:48:02.840151 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:03 crc kubenswrapper[5094]: I0220 06:48:03.839845 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:03 crc kubenswrapper[5094]: I0220 06:48:03.839873 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:03 crc kubenswrapper[5094]: E0220 06:48:03.840138 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:03 crc kubenswrapper[5094]: E0220 06:48:03.840290 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:04 crc kubenswrapper[5094]: I0220 06:48:04.839761 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:04 crc kubenswrapper[5094]: I0220 06:48:04.839791 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:04 crc kubenswrapper[5094]: E0220 06:48:04.839967 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:04 crc kubenswrapper[5094]: E0220 06:48:04.840057 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:05 crc kubenswrapper[5094]: I0220 06:48:05.775163 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.775461 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.775593 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. 
No retries permitted until 2026-02-20 06:49:09.775559098 +0000 UTC m=+164.648185839 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:48:05 crc kubenswrapper[5094]: I0220 06:48:05.839185 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:05 crc kubenswrapper[5094]: I0220 06:48:05.839290 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.841149 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.841423 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:06 crc kubenswrapper[5094]: I0220 06:48:06.839811 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:06 crc kubenswrapper[5094]: I0220 06:48:06.839856 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:06 crc kubenswrapper[5094]: E0220 06:48:06.840023 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:06 crc kubenswrapper[5094]: E0220 06:48:06.840187 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:07 crc kubenswrapper[5094]: I0220 06:48:07.840100 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:07 crc kubenswrapper[5094]: E0220 06:48:07.840468 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:07 crc kubenswrapper[5094]: I0220 06:48:07.840942 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:07 crc kubenswrapper[5094]: E0220 06:48:07.841077 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:08 crc kubenswrapper[5094]: I0220 06:48:08.839322 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:08 crc kubenswrapper[5094]: I0220 06:48:08.839322 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:08 crc kubenswrapper[5094]: E0220 06:48:08.839596 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:08 crc kubenswrapper[5094]: E0220 06:48:08.839787 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:09 crc kubenswrapper[5094]: I0220 06:48:09.839373 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:09 crc kubenswrapper[5094]: I0220 06:48:09.839589 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:09 crc kubenswrapper[5094]: E0220 06:48:09.839850 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:09 crc kubenswrapper[5094]: E0220 06:48:09.840102 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:10 crc kubenswrapper[5094]: I0220 06:48:10.839284 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:10 crc kubenswrapper[5094]: I0220 06:48:10.839588 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:10 crc kubenswrapper[5094]: E0220 06:48:10.840191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:10 crc kubenswrapper[5094]: E0220 06:48:10.840333 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:11 crc kubenswrapper[5094]: I0220 06:48:11.840228 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:11 crc kubenswrapper[5094]: I0220 06:48:11.840294 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:11 crc kubenswrapper[5094]: E0220 06:48:11.840454 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:11 crc kubenswrapper[5094]: E0220 06:48:11.840557 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:12 crc kubenswrapper[5094]: I0220 06:48:12.840002 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:12 crc kubenswrapper[5094]: I0220 06:48:12.840271 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:12 crc kubenswrapper[5094]: E0220 06:48:12.840444 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:12 crc kubenswrapper[5094]: E0220 06:48:12.840634 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:13 crc kubenswrapper[5094]: I0220 06:48:13.839652 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:13 crc kubenswrapper[5094]: I0220 06:48:13.839655 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:13 crc kubenswrapper[5094]: E0220 06:48:13.839914 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:13 crc kubenswrapper[5094]: E0220 06:48:13.840160 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:14 crc kubenswrapper[5094]: I0220 06:48:14.840027 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:14 crc kubenswrapper[5094]: I0220 06:48:14.840099 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:14 crc kubenswrapper[5094]: E0220 06:48:14.840323 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:14 crc kubenswrapper[5094]: E0220 06:48:14.840982 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:14 crc kubenswrapper[5094]: I0220 06:48:14.841416 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:48:14 crc kubenswrapper[5094]: E0220 06:48:14.841678 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:48:15 crc kubenswrapper[5094]: I0220 06:48:15.839925 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:15 crc kubenswrapper[5094]: I0220 06:48:15.842973 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:15 crc kubenswrapper[5094]: E0220 06:48:15.842940 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:15 crc kubenswrapper[5094]: E0220 06:48:15.843411 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:16 crc kubenswrapper[5094]: I0220 06:48:16.840271 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:16 crc kubenswrapper[5094]: I0220 06:48:16.840485 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:16 crc kubenswrapper[5094]: E0220 06:48:16.840637 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:16 crc kubenswrapper[5094]: E0220 06:48:16.841007 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:17 crc kubenswrapper[5094]: I0220 06:48:17.839235 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:17 crc kubenswrapper[5094]: I0220 06:48:17.839298 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:17 crc kubenswrapper[5094]: E0220 06:48:17.839494 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:17 crc kubenswrapper[5094]: E0220 06:48:17.839636 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:18 crc kubenswrapper[5094]: I0220 06:48:18.839510 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:18 crc kubenswrapper[5094]: I0220 06:48:18.839510 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:18 crc kubenswrapper[5094]: E0220 06:48:18.839784 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:18 crc kubenswrapper[5094]: E0220 06:48:18.839851 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:19 crc kubenswrapper[5094]: I0220 06:48:19.839409 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:19 crc kubenswrapper[5094]: I0220 06:48:19.839451 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:19 crc kubenswrapper[5094]: E0220 06:48:19.839792 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:19 crc kubenswrapper[5094]: E0220 06:48:19.840825 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:20 crc kubenswrapper[5094]: I0220 06:48:20.840181 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:20 crc kubenswrapper[5094]: I0220 06:48:20.840446 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:20 crc kubenswrapper[5094]: E0220 06:48:20.840642 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:20 crc kubenswrapper[5094]: E0220 06:48:20.840840 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:21 crc kubenswrapper[5094]: I0220 06:48:21.839247 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:21 crc kubenswrapper[5094]: E0220 06:48:21.839457 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:21 crc kubenswrapper[5094]: I0220 06:48:21.839916 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:21 crc kubenswrapper[5094]: E0220 06:48:21.840167 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.700168 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701492 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/0.log" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701570 5094 generic.go:334] "Generic (PLEG): container finished" podID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79" exitCode=1 Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701636 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerDied","Data":"aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79"} Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701755 5094 scope.go:117] "RemoveContainer" containerID="7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.702584 5094 scope.go:117] "RemoveContainer" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79" Feb 20 06:48:22 crc kubenswrapper[5094]: E0220 06:48:22.703047 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zr8rz_openshift-multus(c3900f6d-3035-4fc4-80a2-9e79154f4f5e)\"" pod="openshift-multus/multus-zr8rz" podUID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.839907 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.840208 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:22 crc kubenswrapper[5094]: E0220 06:48:22.840293 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:22 crc kubenswrapper[5094]: E0220 06:48:22.840827 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:23 crc kubenswrapper[5094]: I0220 06:48:23.708940 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 06:48:23 crc kubenswrapper[5094]: I0220 06:48:23.840108 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:23 crc kubenswrapper[5094]: I0220 06:48:23.840239 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:23 crc kubenswrapper[5094]: E0220 06:48:23.840504 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:23 crc kubenswrapper[5094]: E0220 06:48:23.840764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:24 crc kubenswrapper[5094]: I0220 06:48:24.839824 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:24 crc kubenswrapper[5094]: I0220 06:48:24.839989 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:24 crc kubenswrapper[5094]: E0220 06:48:24.840009 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:24 crc kubenswrapper[5094]: E0220 06:48:24.840307 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:25 crc kubenswrapper[5094]: I0220 06:48:25.842806 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:25 crc kubenswrapper[5094]: I0220 06:48:25.842858 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.843101 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.843347 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:25 crc kubenswrapper[5094]: I0220 06:48:25.843410 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.865774 5094 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.975771 5094 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.727041 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.731396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.732021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.783780 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podStartSLOduration=99.783753336 podStartE2EDuration="1m39.783753336s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:26.782740122 +0000 UTC m=+121.655366833" 
watchObservedRunningTime="2026-02-20 06:48:26.783753336 +0000 UTC m=+121.656380057" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.840366 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.840397 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:26 crc kubenswrapper[5094]: E0220 06:48:26.840594 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:26 crc kubenswrapper[5094]: E0220 06:48:26.840909 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:27 crc kubenswrapper[5094]: I0220 06:48:27.040690 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8ww4n"] Feb 20 06:48:27 crc kubenswrapper[5094]: I0220 06:48:27.040966 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:27 crc kubenswrapper[5094]: E0220 06:48:27.041132 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:27 crc kubenswrapper[5094]: I0220 06:48:27.839118 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:27 crc kubenswrapper[5094]: E0220 06:48:27.839259 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:28 crc kubenswrapper[5094]: I0220 06:48:28.839241 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:28 crc kubenswrapper[5094]: I0220 06:48:28.839244 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:28 crc kubenswrapper[5094]: I0220 06:48:28.839347 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:28 crc kubenswrapper[5094]: E0220 06:48:28.839935 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:28 crc kubenswrapper[5094]: E0220 06:48:28.840118 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:28 crc kubenswrapper[5094]: E0220 06:48:28.840453 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:29 crc kubenswrapper[5094]: I0220 06:48:29.839845 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:29 crc kubenswrapper[5094]: E0220 06:48:29.840061 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:30 crc kubenswrapper[5094]: I0220 06:48:30.840099 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:30 crc kubenswrapper[5094]: I0220 06:48:30.840170 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:30 crc kubenswrapper[5094]: I0220 06:48:30.840234 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.840310 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.840477 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.840579 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.977625 5094 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 06:48:31 crc kubenswrapper[5094]: I0220 06:48:31.839305 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:31 crc kubenswrapper[5094]: E0220 06:48:31.839630 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:32 crc kubenswrapper[5094]: I0220 06:48:32.839409 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:32 crc kubenswrapper[5094]: I0220 06:48:32.839521 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:48:32 crc kubenswrapper[5094]: I0220 06:48:32.839779 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:48:32 crc kubenswrapper[5094]: E0220 06:48:32.839958 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:48:32 crc kubenswrapper[5094]: E0220 06:48:32.840191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:48:32 crc kubenswrapper[5094]: E0220 06:48:32.840450 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:48:33 crc kubenswrapper[5094]: I0220 06:48:33.840019 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:48:33 crc kubenswrapper[5094]: E0220 06:48:33.840254 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:48:34 crc kubenswrapper[5094]: I0220 06:48:34.839582 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:48:34 crc kubenswrapper[5094]: I0220 06:48:34.839655 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:48:34 crc kubenswrapper[5094]: I0220 06:48:34.839938 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:48:34 crc kubenswrapper[5094]: E0220 06:48:34.840111 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:48:34 crc kubenswrapper[5094]: E0220 06:48:34.839944 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:48:34 crc kubenswrapper[5094]: E0220 06:48:34.840292 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:48:35 crc kubenswrapper[5094]: I0220 06:48:35.840504 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:48:35 crc kubenswrapper[5094]: I0220 06:48:35.840616 5094 scope.go:117] "RemoveContainer" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79"
Feb 20 06:48:35 crc kubenswrapper[5094]: E0220 06:48:35.840758 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:48:35 crc kubenswrapper[5094]: E0220 06:48:35.978281 5094 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.773988 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log"
Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.774090 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793"}
Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.839473 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.839529 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.839494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:48:36 crc kubenswrapper[5094]: E0220 06:48:36.839667 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:48:36 crc kubenswrapper[5094]: E0220 06:48:36.839747 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:48:36 crc kubenswrapper[5094]: E0220 06:48:36.839850 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:48:37 crc kubenswrapper[5094]: I0220 06:48:37.839920 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:48:37 crc kubenswrapper[5094]: E0220 06:48:37.841306 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:48:38 crc kubenswrapper[5094]: I0220 06:48:38.839648 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:48:38 crc kubenswrapper[5094]: I0220 06:48:38.839760 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:48:38 crc kubenswrapper[5094]: E0220 06:48:38.839875 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:48:38 crc kubenswrapper[5094]: I0220 06:48:38.839795 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:48:38 crc kubenswrapper[5094]: E0220 06:48:38.839976 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:48:38 crc kubenswrapper[5094]: E0220 06:48:38.840087 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:48:39 crc kubenswrapper[5094]: I0220 06:48:39.840977 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:48:39 crc kubenswrapper[5094]: E0220 06:48:39.841185 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:48:40 crc kubenswrapper[5094]: I0220 06:48:40.839531 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:48:40 crc kubenswrapper[5094]: I0220 06:48:40.839632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:48:40 crc kubenswrapper[5094]: I0220 06:48:40.839642 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:48:40 crc kubenswrapper[5094]: E0220 06:48:40.839694 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:48:40 crc kubenswrapper[5094]: E0220 06:48:40.839834 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:48:40 crc kubenswrapper[5094]: E0220 06:48:40.839989 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:48:41 crc kubenswrapper[5094]: I0220 06:48:41.839464 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:48:41 crc kubenswrapper[5094]: I0220 06:48:41.866167 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 20 06:48:41 crc kubenswrapper[5094]: I0220 06:48:41.866244 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.839685 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.839985 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.840164 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.843208 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.843914 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.844168 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.844427 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.486577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.536060 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.536777 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.537239 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.540901 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.541118 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.542798 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.549253 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p4blk"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.549625 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.550647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.572274 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.573199 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.573647 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.574232 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.574257 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.574978 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.575045 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.575579 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576680 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576754 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576958 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577320 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577465 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577591 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577794 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577939 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.578278 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.578431 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.578596 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.580341 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.580981 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.590522 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.590969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.591250 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.591519 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.592037 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.592343 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.592523 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.581179 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.593321 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.593578 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.581633 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.594850 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.586303 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.586878 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.586961 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587048 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587281 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587316 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587376 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.588245 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.588305 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.589198 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.595843 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.600999 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601055 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601105 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601148 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601182 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601195 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601222 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601254 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601286 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601307 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601319 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601360 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601377 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601399 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601427 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601507 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601537 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.605825 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.606828 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.607337 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.608373 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609090 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609230 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609331 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609884 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.610826 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.610917 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611035 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611761 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611870 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612085 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612280 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612408 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612563 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612687 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612957 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613116 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613158 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613254 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613333 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613403 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613472 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613546 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613590 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613689 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613865 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613993 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614046 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614098 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614134 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614161 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614192 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614226 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614278 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614313 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614586 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.616923 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.626563 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.643478 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.644835 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.644864 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.646606 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.646890 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647086 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647123 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w7rf2"]
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647615 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647860 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647928 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.650211 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.651104 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.657271 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.657865 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.660884 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.663762 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.663961 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.664064 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.666860 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.668005 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.668425 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.668837 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-d2l2r"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.669692 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q596q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.670196 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.670829 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.671759 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.672654 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.674799 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wgh7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.676271 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.679611 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.680915 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.681089 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.682511 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.684247 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9sztr"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.685234 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-shq4j"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.685447 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.685766 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.686216 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.687006 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.687202 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.706360 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.708632 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709009 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x72n5"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e594f26b-0fd6-44a1-93eb-84593591389f-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709251 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrllp\" (UniqueName: \"kubernetes.io/projected/e594f26b-0fd6-44a1-93eb-84593591389f-kube-api-access-jrllp\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709280 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-config\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709302 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709320 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709342 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-service-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709357 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit-dir\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709391 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709409 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38d9642e-3788-4e70-8232-138cd84e02dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709427 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709450 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1974d27-b923-4a9b-9874-d400df5bd29a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709465 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709486 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-auth-proxy-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709535 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f20c574-b730-4bd8-97d1-7751eb7968d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709550 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-encryption-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709566 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1974d27-b923-4a9b-9874-d400df5bd29a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709582 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjs5\" (UniqueName: \"kubernetes.io/projected/2f348b60-0d81-490e-bfb4-ea32546c995a-kube-api-access-sdjs5\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlgjz\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-kube-api-access-rlgjz\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: 
\"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709621 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-dir\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709639 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709696 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-client\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709724 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-serving-cert\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f348b60-0d81-490e-bfb4-ea32546c995a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709759 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709774 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709790 
5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78b7a6b-91b7-4753-bd82-df9d3ea97291-config\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mff\" (UniqueName: \"kubernetes.io/projected/bac53d01-ed38-46a8-ae9e-bfb72e5565a1-kube-api-access-g2mff\") pod \"migrator-59844c95c7-7hljp\" (UID: \"bac53d01-ed38-46a8-ae9e-bfb72e5565a1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709845 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-client\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8tl\" (UniqueName: \"kubernetes.io/projected/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-kube-api-access-6k8tl\") pod 
\"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709888 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-encryption-config\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709906 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f20c574-b730-4bd8-97d1-7751eb7968d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709925 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jjp\" (UniqueName: \"kubernetes.io/projected/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-kube-api-access-d7jjp\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pw6d\" (UniqueName: \"kubernetes.io/projected/38e3be97-7374-4b8b-9565-4d60baa02401-kube-api-access-8pw6d\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc 
kubenswrapper[5094]: I0220 06:48:45.709961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709977 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-node-pullsecrets\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710007 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710022 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710040 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710056 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78b7a6b-91b7-4753-bd82-df9d3ea97291-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710961 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711499 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711565 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711600 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711630 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711656 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e78b7a6b-91b7-4753-bd82-df9d3ea97291-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711718 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711738 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-serving-cert\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711788 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e594f26b-0fd6-44a1-93eb-84593591389f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711832 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mlc\" (UniqueName: \"kubernetes.io/projected/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-kube-api-access-d4mlc\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f88s\" (UniqueName: \"kubernetes.io/projected/5021cb92-f82d-47ee-9978-58e897c354b1-kube-api-access-9f88s\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: 
\"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1974d27-b923-4a9b-9874-d400df5bd29a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712026 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-images\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712051 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-machine-approver-tls\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-image-import-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712121 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712194 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e3be97-7374-4b8b-9565-4d60baa02401-serving-cert\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712254 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-policies\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712303 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: 
\"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712449 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-serving-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712470 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5qm\" (UniqueName: \"kubernetes.io/projected/7b5c64ae-5f80-4e35-91dc-48163991b63d-kube-api-access-mc5qm\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712486 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712509 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmb5r\" (UniqueName: \"kubernetes.io/projected/5f20c574-b730-4bd8-97d1-7751eb7968d4-kube-api-access-cmb5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712562 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712580 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5021cb92-f82d-47ee-9978-58e897c354b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712618 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-config\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712638 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvplx\" (UniqueName: \"kubernetes.io/projected/38d9642e-3788-4e70-8232-138cd84e02dc-kube-api-access-lvplx\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712913 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.713005 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.726179 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.728192 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.728653 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.730035 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.730555 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.731120 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.731732 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.732529 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.732964 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733185 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733578 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733617 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733730 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.742849 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.742934 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.743780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.743828 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nlpvl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744202 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744299 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744650 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744851 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744894 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.745356 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.746880 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.747835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.748814 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.750452 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.751560 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.752234 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.753243 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.754855 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tjkwm"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.756342 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.756632 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.758214 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.759470 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.763766 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.764638 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.767457 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.769080 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.771136 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d2l2r"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.773495 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.775051 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w7rf2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.776494 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q596q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.778984 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.780793 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x72n5"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.781898 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.782497 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.784214 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nlpvl"] 
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.785032 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p4blk"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.785939 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wgh7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.787083 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l2hxn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.788153 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.788309 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.789182 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-v68px"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.789624 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.790267 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.791349 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.792405 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.793486 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.794730 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tjkwm"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.795582 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.797110 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.797976 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.799483 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.800159 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.801458 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.802302 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.803051 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2hxn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.803546 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wcwdv"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.805112 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wcwdv"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.805228 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814818 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814875 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814947 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814981 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e78b7a6b-91b7-4753-bd82-df9d3ea97291-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815039 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815062 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-serving-cert\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 
06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815122 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e594f26b-0fd6-44a1-93eb-84593591389f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815179 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mlc\" (UniqueName: \"kubernetes.io/projected/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-kube-api-access-d4mlc\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815213 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-default-certificate\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815244 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f88s\" (UniqueName: \"kubernetes.io/projected/5021cb92-f82d-47ee-9978-58e897c354b1-kube-api-access-9f88s\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815307 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1974d27-b923-4a9b-9874-d400df5bd29a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815334 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-images\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-machine-approver-tls\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815381 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-image-import-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815409 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815438 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815463 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e3be97-7374-4b8b-9565-4d60baa02401-serving-cert\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815487 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-policies\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815511 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6jl\" (UniqueName: \"kubernetes.io/projected/f1faaf31-f0b6-4828-90cc-51de060dc826-kube-api-access-bn6jl\") pod \"downloads-7954f5f757-d2l2r\" (UID: \"f1faaf31-f0b6-4828-90cc-51de060dc826\") " pod="openshift-console/downloads-7954f5f757-d2l2r"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815600 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815626 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-serving-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815694 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5qm\" (UniqueName: \"kubernetes.io/projected/7b5c64ae-5f80-4e35-91dc-48163991b63d-kube-api-access-mc5qm\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815802 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmb5r\" (UniqueName: \"kubernetes.io/projected/5f20c574-b730-4bd8-97d1-7751eb7968d4-kube-api-access-cmb5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815862 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5021cb92-f82d-47ee-9978-58e897c354b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-stats-auth\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-config\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvplx\" (UniqueName: \"kubernetes.io/projected/38d9642e-3788-4e70-8232-138cd84e02dc-kube-api-access-lvplx\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815981 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e594f26b-0fd6-44a1-93eb-84593591389f-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrllp\" (UniqueName: \"kubernetes.io/projected/e594f26b-0fd6-44a1-93eb-84593591389f-kube-api-access-jrllp\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816100 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-config\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816127 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816155 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816182 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-service-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816184 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6908ab3-3d33-4e31-b226-b6607f34ee8b-service-ca-bundle\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816327 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit-dir\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816388 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816453 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38d9642e-3788-4e70-8232-138cd84e02dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816500 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816552 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1974d27-b923-4a9b-9874-d400df5bd29a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-auth-proxy-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f20c574-b730-4bd8-97d1-7751eb7968d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-encryption-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816835 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1974d27-b923-4a9b-9874-d400df5bd29a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816873 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjs5\" (UniqueName: \"kubernetes.io/projected/2f348b60-0d81-490e-bfb4-ea32546c995a-kube-api-access-sdjs5\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlgjz\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-kube-api-access-rlgjz\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-dir\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfv25\" (UniqueName: \"kubernetes.io/projected/a6908ab3-3d33-4e31-b226-b6607f34ee8b-kube-api-access-dfv25\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817072 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817104 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-client\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817172 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-serving-cert\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817215 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f348b60-0d81-490e-bfb4-ea32546c995a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817283 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817321 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78b7a6b-91b7-4753-bd82-df9d3ea97291-config\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mff\" (UniqueName: \"kubernetes.io/projected/bac53d01-ed38-46a8-ae9e-bfb72e5565a1-kube-api-access-g2mff\") pod \"migrator-59844c95c7-7hljp\" (UID: \"bac53d01-ed38-46a8-ae9e-bfb72e5565a1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817403 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-client\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817473 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8tl\" (UniqueName: \"kubernetes.io/projected/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-kube-api-access-6k8tl\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-encryption-config\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817564 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f20c574-b730-4bd8-97d1-7751eb7968d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817620 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jjp\" (UniqueName: \"kubernetes.io/projected/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-kube-api-access-d7jjp\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817734 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pw6d\" (UniqueName: \"kubernetes.io/projected/38e3be97-7374-4b8b-9565-4d60baa02401-kube-api-access-8pw6d\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817810 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817854 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-node-pullsecrets\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817932 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-metrics-certs\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.818996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819120 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78b7a6b-91b7-4753-bd82-df9d3ea97291-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819988 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.820128 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.820411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-images\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.820963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.821171 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1974d27-b923-4a9b-9874-d400df5bd29a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.822793 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-auth-proxy-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.823423 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f20c574-b730-4bd8-97d1-7751eb7968d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.823620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e594f26b-0fd6-44a1-93eb-84593591389f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824096 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit-dir\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-dir\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824726 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824838 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-encryption-config\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 
06:48:45.825428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.825575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.825614 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-serving-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.825979 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-encryption-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.826443 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-node-pullsecrets\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc 
kubenswrapper[5094]: I0220 06:48:45.826611 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827152 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827166 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-image-import-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 
crc kubenswrapper[5094]: I0220 06:48:45.827898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.828182 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78b7a6b-91b7-4753-bd82-df9d3ea97291-config\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.828863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.828921 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-config\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.829963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830075 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830107 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830128 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830490 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830592 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831301 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-config\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831509 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f20c574-b730-4bd8-97d1-7751eb7968d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831882 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f348b60-0d81-490e-bfb4-ea32546c995a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38d9642e-3788-4e70-8232-138cd84e02dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.826361 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-config\") pod \"machine-approver-56656f9798-xj788\" (UID: 
\"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832792 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-machine-approver-tls\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1974d27-b923-4a9b-9874-d400df5bd29a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832907 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-service-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.833399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-policies\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.834536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.835440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e594f26b-0fd6-44a1-93eb-84593591389f-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5021cb92-f82d-47ee-9978-58e897c354b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837225 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837302 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-client\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: 
\"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837596 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-serving-cert\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837836 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837964 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.838271 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-serving-cert\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.838444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") 
pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.840379 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-client\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.840533 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e3be97-7374-4b8b-9565-4d60baa02401-serving-cert\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.840558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78b7a6b-91b7-4753-bd82-df9d3ea97291-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.841091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.863477 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.882874 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.903497 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-default-certificate\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920458 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6jl\" (UniqueName: \"kubernetes.io/projected/f1faaf31-f0b6-4828-90cc-51de060dc826-kube-api-access-bn6jl\") pod \"downloads-7954f5f757-d2l2r\" (UID: \"f1faaf31-f0b6-4828-90cc-51de060dc826\") " pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-stats-auth\") pod \"router-default-5444994796-9sztr\" (UID: 
\"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920597 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6908ab3-3d33-4e31-b226-b6607f34ee8b-service-ca-bundle\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920659 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfv25\" (UniqueName: \"kubernetes.io/projected/a6908ab3-3d33-4e31-b226-b6607f34ee8b-kube-api-access-dfv25\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921820 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-metrics-certs\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921882 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.942617 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.961822 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.983493 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.002166 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.022586 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.041583 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.061762 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.082670 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.102883 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.123336 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.142721 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.163165 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.182022 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.204010 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.212388 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6908ab3-3d33-4e31-b226-b6607f34ee8b-service-ca-bundle\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.224208 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.243655 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.260772 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-metrics-certs\") pod 
\"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.262220 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.276774 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-stats-auth\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.283920 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.304016 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.324133 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.342559 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.363597 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.383128 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.402532 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.435456 5094 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.442675 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.464685 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.476218 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-default-certificate\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.482238 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.503295 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.523343 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.542685 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.562417 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.582862 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.603142 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.623242 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.642926 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.664007 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.682443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.726022 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.728878 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.741394 5094 request.go:700] Waited for 1.011359077s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.743611 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.763120 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.783558 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.804558 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.824979 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.856303 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.862533 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.882750 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.902123 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.916534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:46 crc kubenswrapper[5094]: E0220 06:48:46.922966 5094 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out 
waiting for the condition Feb 20 06:48:46 crc kubenswrapper[5094]: E0220 06:48:46.923128 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume podName:806ba791-714c-4d13-b595-d4f6ccf06aea nodeName:}" failed. No retries permitted until 2026-02-20 06:48:47.423092592 +0000 UTC m=+142.295719343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume") pod "collect-profiles-29526165-tdww4" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea") : failed to sync configmap cache: timed out waiting for the condition Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.928264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.943221 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.963922 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.983395 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.003581 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.023832 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.043795 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.062427 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.082993 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.103305 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.142543 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.164848 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.183471 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.203781 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.223871 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.243629 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.262794 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.282234 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.302765 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.322971 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.343242 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.363524 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.383467 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.403225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.424183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.442942 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.446054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: 
\"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.447657 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.464010 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.482684 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.503514 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.523470 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.543331 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.563114 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.582825 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.603312 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.651627 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e78b7a6b-91b7-4753-bd82-df9d3ea97291-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.673204 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.692126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.707046 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.711345 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mlc\" (UniqueName: \"kubernetes.io/projected/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-kube-api-access-d4mlc\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.728091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f88s\" (UniqueName: \"kubernetes.io/projected/5021cb92-f82d-47ee-9978-58e897c354b1-kube-api-access-9f88s\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.739398 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjs5\" (UniqueName: \"kubernetes.io/projected/2f348b60-0d81-490e-bfb4-ea32546c995a-kube-api-access-sdjs5\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.758743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlgjz\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-kube-api-access-rlgjz\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.760522 5094 request.go:700] Waited for 1.935615114s due to client-side 
throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.760576 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.793000 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.817177 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrllp\" (UniqueName: \"kubernetes.io/projected/e594f26b-0fd6-44a1-93eb-84593591389f-kube-api-access-jrllp\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.827549 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.828016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.843481 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jjp\" (UniqueName: \"kubernetes.io/projected/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-kube-api-access-d7jjp\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.851649 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.859427 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.863746 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pw6d\" (UniqueName: \"kubernetes.io/projected/38e3be97-7374-4b8b-9565-4d60baa02401-kube-api-access-8pw6d\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.868942 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.887819 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.895506 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5qm\" (UniqueName: \"kubernetes.io/projected/7b5c64ae-5f80-4e35-91dc-48163991b63d-kube-api-access-mc5qm\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.897858 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.904856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2mff\" (UniqueName: \"kubernetes.io/projected/bac53d01-ed38-46a8-ae9e-bfb72e5565a1-kube-api-access-g2mff\") pod \"migrator-59844c95c7-7hljp\" (UID: \"bac53d01-ed38-46a8-ae9e-bfb72e5565a1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.909022 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.920540 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1974d27-b923-4a9b-9874-d400df5bd29a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.940550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8tl\" (UniqueName: \"kubernetes.io/projected/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-kube-api-access-6k8tl\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.966399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmb5r\" (UniqueName: \"kubernetes.io/projected/5f20c574-b730-4bd8-97d1-7751eb7968d4-kube-api-access-cmb5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.970116 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.977613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvplx\" (UniqueName: \"kubernetes.io/projected/38d9642e-3788-4e70-8232-138cd84e02dc-kube-api-access-lvplx\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.988720 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.023077 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.024577 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.026757 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfv25\" (UniqueName: \"kubernetes.io/projected/a6908ab3-3d33-4e31-b226-b6607f34ee8b-kube-api-access-dfv25\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.038723 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.048337 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.061528 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6jl\" (UniqueName: \"kubernetes.io/projected/f1faaf31-f0b6-4828-90cc-51de060dc826-kube-api-access-bn6jl\") pod \"downloads-7954f5f757-d2l2r\" (UID: \"f1faaf31-f0b6-4828-90cc-51de060dc826\") " pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.070352 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.079902 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.098783 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.103343 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.106143 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd63cfc9b_e8ee_4b2a_8f36_f335dc660ca5.slice/crio-94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7 WatchSource:0}: Error finding container 94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7: Status 404 returned error can't find the container with id 94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7 Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.129749 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cc290d_78be_42c6_af5b_3b8b86941eb2.slice/crio-a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60 WatchSource:0}: Error finding container a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60: Status 404 returned error can't find the container with id a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60 Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.143619 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160393 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32eb7f4-e823-4b71-9606-d3dee9f247fd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160430 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160524 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160551 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160566 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-config\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4mjl\" (UniqueName: \"kubernetes.io/projected/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-kube-api-access-n4mjl\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160608 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4797e67f-42c7-4106-998a-f3555218e77d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160648 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160665 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-srv-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160727 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160779 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3bc896-4d90-42f0-92e9-77a7b285e504-serving-cert\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160796 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4js2k\" (UniqueName: \"kubernetes.io/projected/95700d83-436d-43c5-9eb1-381654f43928-kube-api-access-4js2k\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a072f264-8eef-49ff-804c-fc584b41175c-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" 
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d4a320a-daa4-4bce-9782-5e9880aea226-metrics-tls\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-serving-cert\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b9035c-78ce-4d54-859d-48f7853f3f16-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161194 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161222 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqv8\" (UniqueName: \"kubernetes.io/projected/04b9035c-78ce-4d54-859d-48f7853f3f16-kube-api-access-9fqv8\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-trusted-ca\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161277 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-proxy-tls\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161342 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161417 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161521 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b32eb7f4-e823-4b71-9606-d3dee9f247fd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161542 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e67f-42c7-4106-998a-f3555218e77d-config\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161588 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42pbk\" (UniqueName: \"kubernetes.io/projected/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-kube-api-access-42pbk\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161606 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-service-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161642 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161659 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4797e67f-42c7-4106-998a-f3555218e77d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161732 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a072f264-8eef-49ff-804c-fc584b41175c-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-images\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161805 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n54s8\" (UniqueName: \"kubernetes.io/projected/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-kube-api-access-n54s8\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161896 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4cl2\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-kube-api-access-b4cl2\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161923 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5spz\" (UniqueName: \"kubernetes.io/projected/b32eb7f4-e823-4b71-9606-d3dee9f247fd-kube-api-access-s5spz\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161967 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3bc896-4d90-42f0-92e9-77a7b285e504-config\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56s2x\" (UniqueName: \"kubernetes.io/projected/9d4a320a-daa4-4bce-9782-5e9880aea226-kube-api-access-56s2x\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162059 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-client\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162090 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-f5g52\" (UniqueName: \"kubernetes.io/projected/3c3bc896-4d90-42f0-92e9-77a7b285e504-kube-api-access-f5g52\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95700d83-436d-43c5-9eb1-381654f43928-serving-cert\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162201 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162316 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkrp\" (UniqueName: \"kubernetes.io/projected/cbae269e-22bc-484c-ad96-ad61d462a28d-kube-api-access-tvkrp\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162379 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-config\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162484 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/04b9035c-78ce-4d54-859d-48f7853f3f16-proxy-tls\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162560 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.164195 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.664178819 +0000 UTC m=+143.536805530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.180331 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.215442 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265528 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jfb\" (UniqueName: \"kubernetes.io/projected/4a51eb16-597c-47dc-bd54-c16c33bde071-kube-api-access-48jfb\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b9035c-78ce-4d54-859d-48f7853f3f16-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.265832 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.765797176 +0000 UTC m=+143.638423887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm8lg\" (UniqueName: \"kubernetes.io/projected/55946b30-00e1-4bd4-bd8e-3f5761537a0b-kube-api-access-lm8lg\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-apiservice-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266002 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266036 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8csvq\" (UniqueName: 
\"kubernetes.io/projected/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-kube-api-access-8csvq\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266178 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-trusted-ca\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266212 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqv8\" (UniqueName: \"kubernetes.io/projected/04b9035c-78ce-4d54-859d-48f7853f3f16-kube-api-access-9fqv8\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266259 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-proxy-tls\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: 
I0220 06:48:48.266288 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5ljv\" (UniqueName: \"kubernetes.io/projected/368766ec-f562-4296-bcdb-4bcda1db6c45-kube-api-access-n5ljv\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266405 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266434 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-certs\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266457 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a51eb16-597c-47dc-bd54-c16c33bde071-tmpfs\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: 
\"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266489 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-config-volume\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-plugins-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266611 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-node-bootstrap-token\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266652 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42pbk\" (UniqueName: \"kubernetes.io/projected/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-kube-api-access-42pbk\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266680 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b32eb7f4-e823-4b71-9606-d3dee9f247fd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266724 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e67f-42c7-4106-998a-f3555218e77d-config\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266750 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksp7\" (UniqueName: \"kubernetes.io/projected/a4c7d510-2730-46e1-b157-6e890e8868e9-kube-api-access-qksp7\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-service-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-srv-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266862 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4797e67f-42c7-4106-998a-f3555218e77d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266959 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n54s8\" (UniqueName: \"kubernetes.io/projected/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-kube-api-access-n54s8\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266992 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/a072f264-8eef-49ff-804c-fc584b41175c-trusted-ca\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-images\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-registration-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267088 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpg9\" (UniqueName: \"kubernetes.io/projected/fc07e658-5bc5-469e-b793-230b7be58f12-kube-api-access-ndpg9\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4cl2\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-kube-api-access-b4cl2\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267170 
5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-mountpoint-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5spz\" (UniqueName: \"kubernetes.io/projected/b32eb7f4-e823-4b71-9606-d3dee9f247fd-kube-api-access-s5spz\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267233 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267264 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3bc896-4d90-42f0-92e9-77a7b285e504-config\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56s2x\" (UniqueName: \"kubernetes.io/projected/9d4a320a-daa4-4bce-9782-5e9880aea226-kube-api-access-56s2x\") pod 
\"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-client\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267412 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c7d510-2730-46e1-b157-6e890e8868e9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267448 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5g52\" (UniqueName: \"kubernetes.io/projected/3c3bc896-4d90-42f0-92e9-77a7b285e504-kube-api-access-f5g52\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267484 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95700d83-436d-43c5-9eb1-381654f43928-serving-cert\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267521 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267577 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkrp\" (UniqueName: \"kubernetes.io/projected/cbae269e-22bc-484c-ad96-ad61d462a28d-kube-api-access-tvkrp\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267602 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vwzw\" (UniqueName: \"kubernetes.io/projected/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-kube-api-access-2vwzw\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 
06:48:48.267649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267694 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-config\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267806 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267833 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b9035c-78ce-4d54-859d-48f7853f3f16-proxy-tls\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 
20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32eb7f4-e823-4b71-9606-d3dee9f247fd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.268077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.273843 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.273901 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-csi-data-dir\") pod 
\"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.273966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4mjl\" (UniqueName: \"kubernetes.io/projected/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-kube-api-access-n4mjl\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4797e67f-42c7-4106-998a-f3555218e77d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274061 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-config\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274092 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274124 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-key\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274152 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274178 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-srv-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274208 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc07e658-5bc5-469e-b793-230b7be58f12-cert\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274260 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274288 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4js2k\" (UniqueName: \"kubernetes.io/projected/95700d83-436d-43c5-9eb1-381654f43928-kube-api-access-4js2k\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-metrics-tls\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274381 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274402 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3bc896-4d90-42f0-92e9-77a7b285e504-serving-cert\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a072f264-8eef-49ff-804c-fc584b41175c-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274461 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-cabundle\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-webhook-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274578 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb99w\" (UniqueName: \"kubernetes.io/projected/1177f137-190b-4563-8a6f-51d7b0d5ca9c-kube-api-access-mb99w\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274607 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-socket-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274654 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d4a320a-daa4-4bce-9782-5e9880aea226-metrics-tls\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274688 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-serving-cert\") pod 
\"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274676 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b9035c-78ce-4d54-859d-48f7853f3f16-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.283677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.284658 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3bc896-4d90-42f0-92e9-77a7b285e504-config\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.285691 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.291635 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod 
\"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.293575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a072f264-8eef-49ff-804c-fc584b41175c-trusted-ca\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.295657 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-trusted-ca\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.296089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.296572 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-images\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.302812 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e67f-42c7-4106-998a-f3555218e77d-config\") pod 
\"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.305112 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.305833 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-config\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.309342 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.310033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.317768 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.817740406 +0000 UTC m=+143.690367117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.320503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32eb7f4-e823-4b71-9606-d3dee9f247fd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.322450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-proxy-tls\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.323886 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-config\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc 
kubenswrapper[5094]: I0220 06:48:48.324996 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d2l2r"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.325387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.328642 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.328954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-service-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.342245 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a072f264-8eef-49ff-804c-fc584b41175c-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.342855 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.343339 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.343531 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.344031 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.347851 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b9035c-78ce-4d54-859d-48f7853f3f16-proxy-tls\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-serving-cert\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351137 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351146 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-srv-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d4a320a-daa4-4bce-9782-5e9880aea226-metrics-tls\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95700d83-436d-43c5-9eb1-381654f43928-serving-cert\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351652 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-client\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351871 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351942 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3bc896-4d90-42f0-92e9-77a7b285e504-serving-cert\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.352391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.358116 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4js2k\" (UniqueName: \"kubernetes.io/projected/95700d83-436d-43c5-9eb1-381654f43928-kube-api-access-4js2k\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.358971 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b32eb7f4-e823-4b71-9606-d3dee9f247fd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.361129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.364264 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375233 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.375434 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.875394581 +0000 UTC m=+143.748021292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375480 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-socket-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375516 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48jfb\" (UniqueName: \"kubernetes.io/projected/4a51eb16-597c-47dc-bd54-c16c33bde071-kube-api-access-48jfb\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm8lg\" (UniqueName: \"kubernetes.io/projected/55946b30-00e1-4bd4-bd8e-3f5761537a0b-kube-api-access-lm8lg\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375588 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-apiservice-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375614 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8csvq\" (UniqueName: \"kubernetes.io/projected/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-kube-api-access-8csvq\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5ljv\" (UniqueName: \"kubernetes.io/projected/368766ec-f562-4296-bcdb-4bcda1db6c45-kube-api-access-n5ljv\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-certs\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375722 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a51eb16-597c-47dc-bd54-c16c33bde071-tmpfs\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375741 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-config-volume\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-plugins-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-node-bootstrap-token\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-socket-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375830 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qksp7\" (UniqueName: \"kubernetes.io/projected/a4c7d510-2730-46e1-b157-6e890e8868e9-kube-api-access-qksp7\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375919 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-srv-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375988 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376042 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-registration-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpg9\" (UniqueName: \"kubernetes.io/projected/fc07e658-5bc5-469e-b793-230b7be58f12-kube-api-access-ndpg9\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-mountpoint-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376087 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-plugins-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376148 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376188 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c7d510-2730-46e1-b157-6e890e8868e9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vwzw\" (UniqueName: \"kubernetes.io/projected/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-kube-api-access-2vwzw\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376352 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-registration-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376381 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-csi-data-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-mountpoint-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377176 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-key\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377212 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc07e658-5bc5-469e-b793-230b7be58f12-cert\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377247 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-metrics-tls\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377257 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-csi-data-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-webhook-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377503 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-cabundle\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb99w\" (UniqueName: \"kubernetes.io/projected/1177f137-190b-4563-8a6f-51d7b0d5ca9c-kube-api-access-mb99w\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377774 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-config-volume\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn"
Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.377802 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.877779227 +0000 UTC m=+143.750405938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.378674 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-cabundle\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.379145 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a51eb16-597c-47dc-bd54-c16c33bde071-tmpfs\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.381414 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c7d510-2730-46e1-b157-6e890e8868e9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.381781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-srv-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.382510 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-webhook-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.384338 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-metrics-tls\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.384514 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-key\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.385788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56s2x\" (UniqueName: \"kubernetes.io/projected/9d4a320a-daa4-4bce-9782-5e9880aea226-kube-api-access-56s2x\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.385916 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4797e67f-42c7-4106-998a-f3555218e77d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.386743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-node-bootstrap-token\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.390988 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc07e658-5bc5-469e-b793-230b7be58f12-cert\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.392749 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.393472 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-apiservice-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.394021 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-certs\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.410343 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4mjl\" (UniqueName: \"kubernetes.io/projected/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-kube-api-access-n4mjl\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.432664 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5g52\" (UniqueName: \"kubernetes.io/projected/3c3bc896-4d90-42f0-92e9-77a7b285e504-kube-api-access-f5g52\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.453128 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.454712 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.457301 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqv8\" (UniqueName: \"kubernetes.io/projected/04b9035c-78ce-4d54-859d-48f7853f3f16-kube-api-access-9fqv8\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.458820 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.462067 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5spz\" (UniqueName: \"kubernetes.io/projected/b32eb7f4-e823-4b71-9606-d3dee9f247fd-kube-api-access-s5spz\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.480532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.481407 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.98135758 +0000 UTC m=+143.853984291 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.484874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4cl2\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-kube-api-access-b4cl2\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.508142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.516111 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode594f26b_0fd6_44a1_93eb_84593591389f.slice/crio-fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24 WatchSource:0}: Error finding container fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24: Status 404 returned error can't find the container with id fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.529970 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkrp\" (UniqueName: \"kubernetes.io/projected/cbae269e-22bc-484c-ad96-ad61d462a28d-kube-api-access-tvkrp\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.544244 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.561151 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n54s8\" (UniqueName: \"kubernetes.io/projected/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-kube-api-access-n54s8\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.580148 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4797e67f-42c7-4106-998a-f3555218e77d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.582231 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.582658 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.082646258 +0000 UTC m=+143.955272969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.589442 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.592938 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.595441 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w7rf2"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.595494 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.600386 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.602137 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.615867 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.615922 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"]
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.619355 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.625800 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"] Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.625993 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1974d27_b923_4a9b_9874_d400df5bd29a.slice/crio-d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61 WatchSource:0}: Error finding container d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61: Status 404 returned error can't find the container with id d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61 Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.647383 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.651604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42pbk\" (UniqueName: \"kubernetes.io/projected/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-kube-api-access-42pbk\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.652763 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.666290 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.677117 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.680887 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.683038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm8lg\" (UniqueName: \"kubernetes.io/projected/55946b30-00e1-4bd4-bd8e-3f5761537a0b-kube-api-access-lm8lg\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.683784 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.683952 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.183924036 +0000 UTC m=+144.056550747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.684823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.685275 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.185258428 +0000 UTC m=+144.057885139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.689416 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.690996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jfb\" (UniqueName: \"kubernetes.io/projected/4a51eb16-597c-47dc-bd54-c16c33bde071-kube-api-access-48jfb\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.695222 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.703225 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5ljv\" (UniqueName: \"kubernetes.io/projected/368766ec-f562-4296-bcdb-4bcda1db6c45-kube-api-access-n5ljv\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.720448 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.721692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksp7\" (UniqueName: \"kubernetes.io/projected/a4c7d510-2730-46e1-b157-6e890e8868e9-kube-api-access-qksp7\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.729165 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.748803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8csvq\" (UniqueName: \"kubernetes.io/projected/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-kube-api-access-8csvq\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.761148 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.771975 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vwzw\" (UniqueName: \"kubernetes.io/projected/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-kube-api-access-2vwzw\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.775669 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.778244 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpg9\" (UniqueName: \"kubernetes.io/projected/fc07e658-5bc5-469e-b793-230b7be58f12-kube-api-access-ndpg9\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.794695 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.795165 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.29513747 +0000 UTC m=+144.167764181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.795588 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.798887 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.822468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb99w\" (UniqueName: \"kubernetes.io/projected/1177f137-190b-4563-8a6f-51d7b0d5ca9c-kube-api-access-mb99w\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.829430 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p4blk"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.869354 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.870655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerStarted","Data":"48ac16689b00193d6e154a981653d9fe7dd39018c0acc1a7610d05cb116747a3"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.871014 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w7rf2"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.875517 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerStarted","Data":"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.875545 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerStarted","Data":"a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.876497 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.879451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9sztr" event={"ID":"a6908ab3-3d33-4e31-b226-b6607f34ee8b","Type":"ContainerStarted","Data":"85eb772686dce2301c9b652a94f5a2c3a5c8e4d2f6ccf3474d25f433f3d05709"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.879478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9sztr" event={"ID":"a6908ab3-3d33-4e31-b226-b6607f34ee8b","Type":"ContainerStarted","Data":"754f8f0aa0397403892aaf1179a93493ac564f3e42342a60f2bc6e650fffabec"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.885143 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.885616 5094 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fnbl8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.885753 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.894355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerStarted","Data":"cb47109c94bbf8a2aacd53b887c83a904e8dcda61778f8c16e4aec4144261c85"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.895483 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerStarted","Data":"fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.899867 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" event={"ID":"2f348b60-0d81-490e-bfb4-ea32546c995a","Type":"ContainerStarted","Data":"a12ca65cfcd04a2d8d82fbe2a093e225deb8a5bcd4e29b2c59791942f7c1fb4c"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.900219 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.900624 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.400578726 +0000 UTC m=+144.273205437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.902410 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerStarted","Data":"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.902737 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerStarted","Data":"94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.903573 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.903889 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" event={"ID":"38e3be97-7374-4b8b-9565-4d60baa02401","Type":"ContainerStarted","Data":"6ede8375995c4d584a09dc3b8461d2484e83e419a1acd3902384b06a59071516"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.906220 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" event={"ID":"5021cb92-f82d-47ee-9978-58e897c354b1","Type":"ContainerStarted","Data":"227cf5962f112730dac2de0eff0ceafd6faf36426c36f96312ee9d29cbf9c1f4"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.911749 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" event={"ID":"d1974d27-b923-4a9b-9874-d400df5bd29a","Type":"ContainerStarted","Data":"d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.916656 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" event={"ID":"e78b7a6b-91b7-4753-bd82-df9d3ea97291","Type":"ContainerStarted","Data":"bb435a4e0823b71a3c560876c02f7222976e38e69c2fa7d633e72adb567c7fd9"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.921818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" event={"ID":"f8d0112e-e85e-42f1-b28b-c0c996f36fe0","Type":"ContainerStarted","Data":"2e6ed90365089f3cede1e0c536507ed5619f4a57e64f9a16434e2833a90a9772"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.922058 5094 patch_prober.go:28] interesting 
pod/route-controller-manager-6576b87f9c-qrtpl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.922240 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.933299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" event={"ID":"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2","Type":"ContainerStarted","Data":"e7cdc79c1efb693626d3e88d6ac7b76397826febf3c66d451684341422248fd7"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.933353 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" event={"ID":"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2","Type":"ContainerStarted","Data":"ed76240b0720c9e933d3f95294f5481e871153c1b7425f7867bf46042ce3c096"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.933667 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.992857 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.992936 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" event={"ID":"bac53d01-ed38-46a8-ae9e-bfb72e5565a1","Type":"ContainerStarted","Data":"da2c0b1a5ac1bfb2118456e86a925f576435046f08e44bbd1d22ff45a3a68482"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.997964 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" event={"ID":"30a7db23-c18c-4bc6-b1b7-97b32a419fbe","Type":"ContainerStarted","Data":"8d2d652617742cefd28875c8fe01f8304463e4e86ed0e44373501a81794caa80"} Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.002493 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.002659 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.502637083 +0000 UTC m=+144.375263794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.002775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.005211 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.505202824 +0000 UTC m=+144.377829535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.014038 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.026053 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d2l2r"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.033907 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.038414 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.039867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.042921 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.052006 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:49 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:49 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:49 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.052299 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.068492 5094 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.077905 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.123974 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.124417 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.624385966 +0000 UTC m=+144.497012667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.232806 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.233722 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.733692174 +0000 UTC m=+144.606318885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.334759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.335204 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.835186887 +0000 UTC m=+144.707813598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.418216 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" podStartSLOduration=122.418196113 podStartE2EDuration="2m2.418196113s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:49.416118564 +0000 UTC m=+144.288745275" watchObservedRunningTime="2026-02-20 06:48:49.418196113 +0000 UTC m=+144.290822824" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.440092 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.440491 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.940471381 +0000 UTC m=+144.813098092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.541787 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.542232 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.042204719 +0000 UTC m=+144.914831430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.582778 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.586107 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.630574 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.644111 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.644560 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.144540482 +0000 UTC m=+145.017167193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.703881 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q596q"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.742517 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.746720 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.747194 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.247171973 +0000 UTC m=+145.119798684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.778191 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" podStartSLOduration=122.778165657 podStartE2EDuration="2m2.778165657s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:49.776950708 +0000 UTC m=+144.649577419" watchObservedRunningTime="2026-02-20 06:48:49.778165657 +0000 UTC m=+144.650792368" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.849078 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.849544 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.349526917 +0000 UTC m=+145.222153628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.950536 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.951286 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.451261125 +0000 UTC m=+145.323887836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.952163 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.952537 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.452529325 +0000 UTC m=+145.325156036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: W0220 06:48:50.002136 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3c622d_9c0e_43f5_a5ce_de2dbbab5f60.slice/crio-c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1 WatchSource:0}: Error finding container c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1: Status 404 returned error can't find the container with id c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1 Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.035833 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9sztr" podStartSLOduration=123.035811668 podStartE2EDuration="2m3.035811668s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.035300906 +0000 UTC m=+144.907927617" watchObservedRunningTime="2026-02-20 06:48:50.035811668 +0000 UTC m=+144.908438379" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.045280 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:50 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:50 crc 
kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:50 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.045325 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.047311 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d2l2r" event={"ID":"f1faaf31-f0b6-4828-90cc-51de060dc826","Type":"ContainerStarted","Data":"39c2e7ca833f7ef7d89846da7787d3050a9057ef816f78e796884a8eb050d1b8"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.054567 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.055075 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.555060424 +0000 UTC m=+145.427687135 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.071556 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" event={"ID":"bac53d01-ed38-46a8-ae9e-bfb72e5565a1","Type":"ContainerStarted","Data":"d1cb1bbeb617bd587b25db9333fc7fb27df8d78e2c95ab8f83863ba53dcbe5ed"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.074814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" event={"ID":"8f8623a7-b3d4-49ad-86c5-40f19adf7b09","Type":"ContainerStarted","Data":"e4130178047d6af3e0ab4ff09453de029b92e5516ad70ed26661873561710385"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.077482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" event={"ID":"04b9035c-78ce-4d54-859d-48f7853f3f16","Type":"ContainerStarted","Data":"d9b7bdc00087de0003b0a5043af6ab498b1d9d30621c71ee03e559ff8b27fd22"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.084089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" event={"ID":"9d4a320a-daa4-4bce-9782-5e9880aea226","Type":"ContainerStarted","Data":"6a844a7f76f541126efc8c47b592165b18584966dd0a56577240a4f095455b25"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.114642 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" 
event={"ID":"3c3bc896-4d90-42f0-92e9-77a7b285e504","Type":"ContainerStarted","Data":"13cdf00e66336afea1889b2ebf66c8dc720a6e204ebd2d05b9680bb58da74e01"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.121802 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" event={"ID":"38d9642e-3788-4e70-8232-138cd84e02dc","Type":"ContainerStarted","Data":"edfcdda3c0dd05b189c70af2eefb8ceb774a6586f9bbfe4bf0bf8eba5d0437ec"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.158559 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.158915 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.658888792 +0000 UTC m=+145.531515503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.221112 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerStarted","Data":"c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.252847 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" event={"ID":"30a7db23-c18c-4bc6-b1b7-97b32a419fbe","Type":"ContainerStarted","Data":"bf1b86b477ef7d9b7bcdd448e8e0069748f9325d93b5f710501c516d3a5a1eaf"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.276806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.277584 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.777567162 +0000 UTC m=+145.650193873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.290562 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.304758 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tjkwm"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.304899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" event={"ID":"95700d83-436d-43c5-9eb1-381654f43928","Type":"ContainerStarted","Data":"335c5974d95e909d3c1bf3645c6101a019d706448889a79cc5d66153648d38b6"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.318266 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" podStartSLOduration=124.318246506 podStartE2EDuration="2m4.318246506s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.304348126 +0000 UTC m=+145.176974837" watchObservedRunningTime="2026-02-20 06:48:50.318246506 +0000 UTC m=+145.190873217" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.318672 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 
06:48:50.319793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" event={"ID":"cbae269e-22bc-484c-ad96-ad61d462a28d","Type":"ContainerStarted","Data":"e8b5a8b902e7d3e3817c89caef4d0547445c42deb98a5819f0326602607bb7ff"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.326877 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" event={"ID":"38e3be97-7374-4b8b-9565-4d60baa02401","Type":"ContainerStarted","Data":"e4b5fdc4272d56740a6fa5f9ad11cf7fbbde3b0835b268c3ac507b39debb72ca"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.333498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" event={"ID":"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2","Type":"ContainerStarted","Data":"49373e13ecb57c8fe048d20a0d22ca99da0653a9549885af317cf26b90eec2f0"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.347423 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x72n5"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.356956 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-v68px" event={"ID":"55946b30-00e1-4bd4-bd8e-3f5761537a0b","Type":"ContainerStarted","Data":"197208f93ee4a516cae15fc41066c22afb9af5f517a03e1efa9853a344e20e77"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.357402 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.370288 5094 generic.go:334] "Generic (PLEG): container finished" podID="e594f26b-0fd6-44a1-93eb-84593591389f" containerID="cb47109c94bbf8a2aacd53b887c83a904e8dcda61778f8c16e4aec4144261c85" 
exitCode=0 Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.372073 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerDied","Data":"cb47109c94bbf8a2aacd53b887c83a904e8dcda61778f8c16e4aec4144261c85"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.377627 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.379296 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerStarted","Data":"756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734"} Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.379744 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.87972656 +0000 UTC m=+145.752353271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.380024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.396469 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" event={"ID":"f8d0112e-e85e-42f1-b28b-c0c996f36fe0","Type":"ContainerStarted","Data":"4a25127fcda14cf3e269b7884f247e76e7958e092869779a7319a4da122552f1"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.400972 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerStarted","Data":"2cba69ee98eb39e06e5e23c24335f67d934d7ddd307c2f258f0da6b72887c796"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.414093 5094 csr.go:261] certificate signing request csr-hw6gz is approved, waiting to be issued Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.414125 5094 csr.go:257] certificate signing request csr-hw6gz is issued Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.450995 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" event={"ID":"5f20c574-b730-4bd8-97d1-7751eb7968d4","Type":"ContainerStarted","Data":"082f0f7066d0aab2b6362a9588af4682178c61915c83e74167c3503683d40eba"} Feb 
20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.454014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" event={"ID":"2f348b60-0d81-490e-bfb4-ea32546c995a","Type":"ContainerStarted","Data":"fad35881a9e75d02e1ebd99e539d4791ed3e4cc6d7bb713a9c9ef957e003c1b4"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.478061 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerStarted","Data":"39214b864a40c936cc1d08c8d3c2e33aaa966fe4b56fdd3a03d19e9802e84ad6"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.478478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.478891 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.978872388 +0000 UTC m=+145.851499099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.479291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.486089 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.986073819 +0000 UTC m=+145.858700530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.486775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" event={"ID":"5021cb92-f82d-47ee-9978-58e897c354b1","Type":"ContainerStarted","Data":"96d9f4a3e51cdc054f5fbb2e7cc16f9f5e938c18d9e211245e56c56c47e1c814"}
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.493669 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.500538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.580436 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.580982 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.080967116 +0000 UTC m=+145.953593827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.648446 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.722143 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.727385 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.227365753 +0000 UTC m=+146.099992464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: W0220 06:48:50.752093 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4797e67f_42c7_4106_998a_f3555218e77d.slice/crio-bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b WatchSource:0}: Error finding container bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b: Status 404 returned error can't find the container with id bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.775765 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.775853 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.810693 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wcwdv"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.830544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.830971 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.330949686 +0000 UTC m=+146.203576397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.858108 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" podStartSLOduration=124.858086858 podStartE2EDuration="2m4.858086858s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.817975729 +0000 UTC m=+145.690602430" watchObservedRunningTime="2026-02-20 06:48:50.858086858 +0000 UTC m=+145.730713569"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.858311 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" podStartSLOduration=124.858307053 podStartE2EDuration="2m4.858307053s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.854971954 +0000 UTC m=+145.727598665" watchObservedRunningTime="2026-02-20 06:48:50.858307053 +0000 UTC m=+145.730933764"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.862630 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wgh7"]
Feb 20 06:48:50 crc kubenswrapper[5094]: W0220 06:48:50.878505 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74a0c7b_3ad6_4f59_b4ff_33e1209c3116.slice/crio-a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434 WatchSource:0}: Error finding container a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434: Status 404 returned error can't find the container with id a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.895902 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" podStartSLOduration=123.895882283 podStartE2EDuration="2m3.895882283s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.894354967 +0000 UTC m=+145.766981678" watchObservedRunningTime="2026-02-20 06:48:50.895882283 +0000 UTC m=+145.768508994"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.916690 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.916932 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nlpvl"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.935804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.936295 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.436277779 +0000 UTC m=+146.308904490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.938256 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.955241 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.959173 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" podStartSLOduration=124.959149231 podStartE2EDuration="2m4.959149231s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.958515096 +0000 UTC m=+145.831141807" watchObservedRunningTime="2026-02-20 06:48:50.959149231 +0000 UTC m=+145.831775942"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.972353 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2hxn"]
Feb 20 06:48:51 crc kubenswrapper[5094]: W0220 06:48:51.034009 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bba9a34_7bbd_44be_b82d_0a35f8ef288f.slice/crio-2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b WatchSource:0}: Error finding container 2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b: Status 404 returned error can't find the container with id 2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.039981 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.040479 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.540462776 +0000 UTC m=+146.413089487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.058938 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 06:48:51 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld
Feb 20 06:48:51 crc kubenswrapper[5094]: [+]process-running ok
Feb 20 06:48:51 crc kubenswrapper[5094]: healthz check failed
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.059000 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.141372 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.141723 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.641695704 +0000 UTC m=+146.514322415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.243333 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.245146 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.745118933 +0000 UTC m=+146.617745644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.349922 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.350450 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.850437666 +0000 UTC m=+146.723064377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.416081 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-20 06:43:50 +0000 UTC, rotation deadline is 2027-01-10 02:55:08.296523138 +0000 UTC
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.416133 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7772h6m16.880393726s for next certificate rotation
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.457645 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.458074 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.958040485 +0000 UTC m=+146.830667196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.509375 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" event={"ID":"1177f137-190b-4563-8a6f-51d7b0d5ca9c","Type":"ContainerStarted","Data":"bf02145176698932aa6290346dc826656b111284edfa77724b46c1b2ded7ed25"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.537235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" event={"ID":"4a51eb16-597c-47dc-bd54-c16c33bde071","Type":"ContainerStarted","Data":"23a6d21bc9f670c58daac935b4cf34dca1e21d756f0ef871577799c139f8fb5e"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.548064 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-v68px" event={"ID":"55946b30-00e1-4bd4-bd8e-3f5761537a0b","Type":"ContainerStarted","Data":"840aef56862c43e15d0ba9394df0bc3dd1070841c75042790202d3f243016924"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.560129 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.560534 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.060510191 +0000 UTC m=+146.933136902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.566377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" event={"ID":"5021cb92-f82d-47ee-9978-58e897c354b1","Type":"ContainerStarted","Data":"2b2ded0bd00b9b022717edbca5dcc4a046119966453331ebc23620ffd9ed7974"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.572600 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-v68px" podStartSLOduration=6.572559325 podStartE2EDuration="6.572559325s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.569074033 +0000 UTC m=+146.441700734" watchObservedRunningTime="2026-02-20 06:48:51.572559325 +0000 UTC m=+146.445186036"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.621045 5094 generic.go:334] "Generic (PLEG): container finished" podID="7b5c64ae-5f80-4e35-91dc-48163991b63d" containerID="719cfe3070710391ae7ff475b31d0d0d3486957b9eff677db33007a93967ce37" exitCode=0
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.621150 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerDied","Data":"719cfe3070710391ae7ff475b31d0d0d3486957b9eff677db33007a93967ce37"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.639019 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" podStartSLOduration=124.638987799 podStartE2EDuration="2m4.638987799s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.61538476 +0000 UTC m=+146.488011481" watchObservedRunningTime="2026-02-20 06:48:51.638987799 +0000 UTC m=+146.511614510"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.656807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" event={"ID":"a072f264-8eef-49ff-804c-fc584b41175c","Type":"ContainerStarted","Data":"8df1751deaf4b25423b8ffb2da16475453a0ad90ed8820c56ff1d590f14fbf0f"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.664101 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.664377 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.164344609 +0000 UTC m=+147.036971320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.664659 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.668778 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.168757274 +0000 UTC m=+147.041383985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.673804 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d2l2r" event={"ID":"f1faaf31-f0b6-4828-90cc-51de060dc826","Type":"ContainerStarted","Data":"2a68290fa1c4ca30a26485e5b789d657f8d3dc6f5892aa3b43625cdefd1e9e27"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.674859 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d2l2r"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.679909 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.679949 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.734025 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-d2l2r" podStartSLOduration=124.733998869 podStartE2EDuration="2m4.733998869s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.727323241 +0000 UTC m=+146.599949962" watchObservedRunningTime="2026-02-20 06:48:51.733998869 +0000 UTC m=+146.606625580"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.767673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" event={"ID":"3c3bc896-4d90-42f0-92e9-77a7b285e504","Type":"ContainerStarted","Data":"4dad789791bce622abb9ee22f3228dec0cc564cb39baf6ba060ecc65e94f54c7"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.768430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.769280 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.269255904 +0000 UTC m=+147.141882615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.784338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" event={"ID":"04b9035c-78ce-4d54-859d-48f7853f3f16","Type":"ContainerStarted","Data":"83565444373c2a0fa672cd6ee113621d4b694e3092ba567c877c9e22ef97aa72"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.790899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" event={"ID":"9d4a320a-daa4-4bce-9782-5e9880aea226","Type":"ContainerStarted","Data":"ee8420d5ace56cc69ef966f5d6519926a22add67977f2bfc4d794cc0072b4163"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.807324 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" podStartSLOduration=124.807304314 podStartE2EDuration="2m4.807304314s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.806445774 +0000 UTC m=+146.679072485" watchObservedRunningTime="2026-02-20 06:48:51.807304314 +0000 UTC m=+146.679931025"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.815008 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"661750a3cc39a7eb8cd7dd9ec0f8265614f2d44bceeb6672bb22df6e24aa3240"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.870058 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.870627 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.370601933 +0000 UTC m=+147.243228644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.874934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" event={"ID":"cbae269e-22bc-484c-ad96-ad61d462a28d","Type":"ContainerStarted","Data":"81a34f90029b69885d8fb9a456a644431c73a92d9c187ca4a2f3ebd2ae84c509"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.874995 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.885035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" event={"ID":"4797e67f-42c7-4106-998a-f3555218e77d","Type":"ContainerStarted","Data":"bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.890255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" event={"ID":"b32eb7f4-e823-4b71-9606-d3dee9f247fd","Type":"ContainerStarted","Data":"265402085e76d0cbb939ebfe62c7c80946892bfea0b80dadf2850d32502ab2c2"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.890317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" event={"ID":"b32eb7f4-e823-4b71-9606-d3dee9f247fd","Type":"ContainerStarted","Data":"1015b645cb5c31550a6ff0c30d96eab4a332985e6e6de7855f9033e0fb7ad816"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.896086 5094 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-57pxl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.896143 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" podUID="cbae269e-22bc-484c-ad96-ad61d462a28d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.914443 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" podStartSLOduration=124.914422971 podStartE2EDuration="2m4.914422971s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.912662049 +0000 UTC m=+146.785288760" watchObservedRunningTime="2026-02-20 06:48:51.914422971 +0000 UTC m=+146.787049682"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.953670 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" event={"ID":"e78b7a6b-91b7-4753-bd82-df9d3ea97291","Type":"ContainerStarted","Data":"88c64d8bec3f92cdba2164ac30b7327552174624512758460a6aded0595cbf34"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.970759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.972893 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.472875535 +0000 UTC m=+147.345502246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.980466 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" podStartSLOduration=124.980441064 podStartE2EDuration="2m4.980441064s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.967872007 +0000 UTC m=+146.840498718" watchObservedRunningTime="2026-02-20 06:48:51.980441064 +0000 UTC m=+146.853067775"
Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.024195 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" podStartSLOduration=125.02417096 podStartE2EDuration="2m5.02417096s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.020668986 +0000 UTC m=+146.893295697" watchObservedRunningTime="2026-02-20 06:48:52.02417096 +0000 UTC m=+146.896797691"
Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.026141 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"
event={"ID":"bac53d01-ed38-46a8-ae9e-bfb72e5565a1","Type":"ContainerStarted","Data":"8321c815f13d3667834d96eec3965c5f2787df004384a0ce9a6ec23d4b4869e4"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.029181 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" event={"ID":"2f348b60-0d81-490e-bfb4-ea32546c995a","Type":"ContainerStarted","Data":"5cdc5822740b7095d1974c19dbccda507767ab0c1fa72d48fbc6d25078c0b34d"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.040748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" event={"ID":"5f20c574-b730-4bd8-97d1-7751eb7968d4","Type":"ContainerStarted","Data":"b0a2acc52e341f79f63d9a1995205da3048e7b090d068e86900dee98e8bd9b48"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.045867 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:52 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:52 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:52 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.045912 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.065183 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" 
event={"ID":"7a75a178-3fa3-4be7-b29f-e1f01dc859a4","Type":"ContainerStarted","Data":"6d302e0e8c1edc759fea6176bad4f60407b1ec7aa0ae6161b7a98608854ada3d"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.072095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.072402 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.572391121 +0000 UTC m=+147.445017832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.097275 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" event={"ID":"38d9642e-3788-4e70-8232-138cd84e02dc","Type":"ContainerStarted","Data":"34d6c7aa37fc8eb5eeda4b0cea58bd93dc213b767611900685a4c35eeeddad03"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.129722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wcwdv" 
event={"ID":"fc07e658-5bc5-469e-b793-230b7be58f12","Type":"ContainerStarted","Data":"3139a68b48fcef2d120274aed4963d2f72e77ba6fc1d7a2cb360251ad2f8742a"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.129842 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" podStartSLOduration=125.129824641 podStartE2EDuration="2m5.129824641s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.124882214 +0000 UTC m=+146.997508925" watchObservedRunningTime="2026-02-20 06:48:52.129824641 +0000 UTC m=+147.002451352" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.134096 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" podStartSLOduration=125.134080102 podStartE2EDuration="2m5.134080102s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.071061959 +0000 UTC m=+146.943688670" watchObservedRunningTime="2026-02-20 06:48:52.134080102 +0000 UTC m=+147.006706813" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.173471 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.175475 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.675435511 +0000 UTC m=+147.548062222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.208911 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8623a7-b3d4-49ad-86c5-40f19adf7b09" containerID="ee3a87c537aa4fca93d8f998dea856908205a9a56132c65c958d2298537ef172" exitCode=0 Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.209010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" event={"ID":"8f8623a7-b3d4-49ad-86c5-40f19adf7b09","Type":"ContainerDied","Data":"ee3a87c537aa4fca93d8f998dea856908205a9a56132c65c958d2298537ef172"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.215449 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" podStartSLOduration=125.215422929 podStartE2EDuration="2m5.215422929s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.192213259 +0000 UTC m=+147.064839970" watchObservedRunningTime="2026-02-20 06:48:52.215422929 +0000 UTC m=+147.088049640" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.246573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" event={"ID":"d1974d27-b923-4a9b-9874-d400df5bd29a","Type":"ContainerStarted","Data":"b51efc26bb16dfcde63b61af8f545647be6a0ba0478045391f6cf4ebfce8d61f"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.259915 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wcwdv" podStartSLOduration=7.259879761 podStartE2EDuration="7.259879761s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.246231318 +0000 UTC m=+147.118858019" watchObservedRunningTime="2026-02-20 06:48:52.259879761 +0000 UTC m=+147.132506472" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.269388 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" event={"ID":"95700d83-436d-43c5-9eb1-381654f43928","Type":"ContainerStarted","Data":"acbb92fa881d8208be98d779d340579e572e5c0e423b335c0c79f6ab35419cc5"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.270652 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.271830 5094 patch_prober.go:28] interesting pod/console-operator-58897d9998-w7rf2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.271877 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" podUID="95700d83-436d-43c5-9eb1-381654f43928" containerName="console-operator" probeResult="failure" 
output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.282003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.282748 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.782734972 +0000 UTC m=+147.655361683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.348727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerStarted","Data":"9e9ebe837e9f43c76e6f912fde9f9a4d76af0096fe554a67909ac3cf138a323a"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.366453 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" podStartSLOduration=125.366432884 podStartE2EDuration="2m5.366432884s" 
podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.301296882 +0000 UTC m=+147.173923593" watchObservedRunningTime="2026-02-20 06:48:52.366432884 +0000 UTC m=+147.239059595" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.387395 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.392423 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.892381908 +0000 UTC m=+147.765008619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.394235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2hxn" event={"ID":"810d7855-cab6-4e33-9a5f-6d7bac9f66eb","Type":"ContainerStarted","Data":"26b003bdf0aedeeb11371c691d87fa14db0e6378b4677fe005cfb4a29e342bd9"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.399850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" event={"ID":"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c","Type":"ContainerStarted","Data":"1e7a66379829ca3facd3965ad9642612d26bed5b2aa8d5bd79cbfffc9fb31129"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.441912 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" podStartSLOduration=125.44187731 podStartE2EDuration="2m5.44187731s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.433451451 +0000 UTC m=+147.306078162" watchObservedRunningTime="2026-02-20 06:48:52.44187731 +0000 UTC m=+147.314504021" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.460083 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" 
event={"ID":"a4c7d510-2730-46e1-b157-6e890e8868e9","Type":"ContainerStarted","Data":"8c84370dce58890b84bc2db3bb249922847dd52c6f58e7de76c62408a999789c"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.490927 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerStarted","Data":"97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.494038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.502264 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.001780659 +0000 UTC m=+147.874407370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.522222 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" event={"ID":"6bba9a34-7bbd-44be-b82d-0a35f8ef288f","Type":"ContainerStarted","Data":"2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.530620 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" podStartSLOduration=125.530588961 podStartE2EDuration="2m5.530588961s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.47228851 +0000 UTC m=+147.344915221" watchObservedRunningTime="2026-02-20 06:48:52.530588961 +0000 UTC m=+147.403215672" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.545313 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerStarted","Data":"ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.555695 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.578170 
5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" event={"ID":"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116","Type":"ContainerStarted","Data":"a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.580619 5094 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-45dt8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.580651 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.595811 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-shq4j" podStartSLOduration=125.595792895 podStartE2EDuration="2m5.595792895s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.521308151 +0000 UTC m=+147.393934872" watchObservedRunningTime="2026-02-20 06:48:52.595792895 +0000 UTC m=+147.468419606" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.604302 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.605910 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.105892364 +0000 UTC m=+147.978519075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.622398 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" podStartSLOduration=125.622374714 podStartE2EDuration="2m5.622374714s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.580898683 +0000 UTC m=+147.453525394" watchObservedRunningTime="2026-02-20 06:48:52.622374714 +0000 UTC m=+147.495001425" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.623087 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" podStartSLOduration=125.623082331 podStartE2EDuration="2m5.623082331s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.622102128 +0000 UTC m=+147.494728829" 
watchObservedRunningTime="2026-02-20 06:48:52.623082331 +0000 UTC m=+147.495709042" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.663469 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podStartSLOduration=125.663446077 podStartE2EDuration="2m5.663446077s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.662407352 +0000 UTC m=+147.535034063" watchObservedRunningTime="2026-02-20 06:48:52.663446077 +0000 UTC m=+147.536072788" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.713807 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.714321 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.214304921 +0000 UTC m=+148.086931632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.814508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.815008 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.314993766 +0000 UTC m=+148.187620477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.916006 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.916439 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.416419137 +0000 UTC m=+148.289045848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.961992 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.019487 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.020411 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.520387308 +0000 UTC m=+148.393014009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.043859 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 06:48:53 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld
Feb 20 06:48:53 crc kubenswrapper[5094]: [+]process-running ok
Feb 20 06:48:53 crc kubenswrapper[5094]: healthz check failed
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.043926 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.122717 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.123305 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.623289395 +0000 UTC m=+148.495916106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.224312 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.224538 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.724506222 +0000 UTC m=+148.597132933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.224844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.225246 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.72523341 +0000 UTC m=+148.597860121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.326636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.326827 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.826775383 +0000 UTC m=+148.699402094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.327041 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.327507 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.827486991 +0000 UTC m=+148.700113702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.428406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.428609 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.928530533 +0000 UTC m=+148.801157244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.429141 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.429504 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.929491126 +0000 UTC m=+148.802117837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.530825 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.531054 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.03102137 +0000 UTC m=+148.903648081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.531377 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.531783 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.031775118 +0000 UTC m=+148.904401829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.585752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" event={"ID":"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c","Type":"ContainerStarted","Data":"8d9bec351c0fc0710acb33b8eb536c118b9ecfcdb3f10ae2fa68086a118dd1b8"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.587659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" event={"ID":"4a51eb16-597c-47dc-bd54-c16c33bde071","Type":"ContainerStarted","Data":"c22aae595dca26c79cc3480d16d8c1f950ef601d9152c428bb76b7bbe872ddb7"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.587930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.589648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" event={"ID":"04b9035c-78ce-4d54-859d-48f7853f3f16","Type":"ContainerStarted","Data":"a9083153b32dccca3a1a51f54832e7b5d9b64685ff7b9c07c5a26dcd3375bff8"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.591123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" event={"ID":"6bba9a34-7bbd-44be-b82d-0a35f8ef288f","Type":"ContainerStarted","Data":"ad945fcd7b5fc2231897504821dcad743519120a4408fb921971af86ef379d72"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.592346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"a7aa8f24b3d5c02da7d0d9ade1d6c71e80edc6d5d094e75e7faf8acb9f27b369"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.594762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerStarted","Data":"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.597722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" event={"ID":"7a75a178-3fa3-4be7-b29f-e1f01dc859a4","Type":"ContainerStarted","Data":"c1b9527c261d3d942362984ecc003014543b74961a1ad7b377199b42cfe481fd"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.597835 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" event={"ID":"7a75a178-3fa3-4be7-b29f-e1f01dc859a4","Type":"ContainerStarted","Data":"405a2793231e7c6a02ddcec6356e5472cf96043d144d133078ccfa674f0b6e7d"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.601288 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerStarted","Data":"e07f8d0b3df405b8ca7123df2b4d330a242cea3e2391d205143c1736e74eed3f"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.601817 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.603636 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" event={"ID":"8f8623a7-b3d4-49ad-86c5-40f19adf7b09","Type":"ContainerStarted","Data":"1a59ccd6bbe9f1a540d0c1a933e3878040fc5896cf3d2a910619406b037a63b4"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.606451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerStarted","Data":"fdae0bc7ef05a432f7a20ccaf763474980635bd783e9a84abb8a1efd22c2e19a"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.606781 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerStarted","Data":"8f62c0f78e34ee5f42c8290c19850b1d8530b909852489e916c02461c7141c0e"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.608538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" event={"ID":"a4c7d510-2730-46e1-b157-6e890e8868e9","Type":"ContainerStarted","Data":"25e8f1f02a279669e9900fa8c25d826db833f42e74e3a27e72fe5628a8c009ca"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.608639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" event={"ID":"a4c7d510-2730-46e1-b157-6e890e8868e9","Type":"ContainerStarted","Data":"5554fd8851fba78578cd37c32041bc8b94449f93717dfa1fafad97b26ea558b0"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.609156 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.611035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2hxn" event={"ID":"810d7855-cab6-4e33-9a5f-6d7bac9f66eb","Type":"ContainerStarted","Data":"54e1f7eebaa262f720d08801968d75fa654a376365c50075aea8306d86adaca2"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.611155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2hxn" event={"ID":"810d7855-cab6-4e33-9a5f-6d7bac9f66eb","Type":"ContainerStarted","Data":"a4a15bfab14b5ab990972032c11c59591a4046aedf93d0d5979bb12442b17255"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.611659 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-l2hxn"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.615763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" event={"ID":"9d4a320a-daa4-4bce-9782-5e9880aea226","Type":"ContainerStarted","Data":"b106a2e1c14cd7ee8c25551b0a146cc90e8abcc581533a5f380abd323e3b3ffc"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.617736 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" event={"ID":"4797e67f-42c7-4106-998a-f3555218e77d","Type":"ContainerStarted","Data":"a5febfdf6ff13b73885ae68771975d75b338ef6c32185a51f9e6678e4521e008"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.619350 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" podStartSLOduration=126.619335421 podStartE2EDuration="2m6.619335421s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.612604461 +0000 UTC m=+148.485231172" watchObservedRunningTime="2026-02-20 06:48:53.619335421 +0000 UTC m=+148.491962142"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.619529 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" event={"ID":"a072f264-8eef-49ff-804c-fc584b41175c","Type":"ContainerStarted","Data":"b4f6631b74da908d11b454e73b916bf4df1f856d6688d805ff2095c0e2fb34d3"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.619585 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" event={"ID":"a072f264-8eef-49ff-804c-fc584b41175c","Type":"ContainerStarted","Data":"539612e997caaec2d06de1fe954cd9b6cb5d14bfdcb52922ce688d299da84cf3"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.625737 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" event={"ID":"1177f137-190b-4563-8a6f-51d7b0d5ca9c","Type":"ContainerStarted","Data":"3fc35bcf407511bee7a9e8b7075c2f9ea4133bcc7d328b3cdc64dcc8dbbd4ad4"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.625794 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.628479 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" event={"ID":"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116","Type":"ContainerStarted","Data":"36f38721fd0a0d7d7219e3717cfb74f975fa115ab06ab7f2670028cdc038f21f"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.628660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" event={"ID":"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116","Type":"ContainerStarted","Data":"564bd1116ccb908b0682633fbde86bceed7a68881dac681bc15ae456085179cf"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.632537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.634621 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.134587582 +0000 UTC m=+149.007214293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.636176 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wcwdv" event={"ID":"fc07e658-5bc5-469e-b793-230b7be58f12","Type":"ContainerStarted","Data":"595d2440e384fdef9f38c9b5f5a4f8d8b79d0f87121891fbe8926f8e447d7447"}
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.643859 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.643938 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.657318 5094 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-45dt8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.657407 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.662754 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.695881 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" podStartSLOduration=126.695858473 podStartE2EDuration="2m6.695858473s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.69487533 +0000 UTC m=+148.567502041" watchObservedRunningTime="2026-02-20 06:48:53.695858473 +0000 UTC m=+148.568485184"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.700135 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-w7rf2"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.711960 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.738049 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.751894 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" podStartSLOduration=126.751867849 podStartE2EDuration="2m6.751867849s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.738731679 +0000 UTC m=+148.611358390" watchObservedRunningTime="2026-02-20 06:48:53.751867849 +0000 UTC m=+148.624494560"
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.753227 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.253204951 +0000 UTC m=+149.125831652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.794637 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" podStartSLOduration=126.794614021 podStartE2EDuration="2m6.794614021s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.791072468 +0000 UTC m=+148.663699179" watchObservedRunningTime="2026-02-20 06:48:53.794614021 +0000 UTC m=+148.667240732"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.822511 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" podStartSLOduration=126.822489471 podStartE2EDuration="2m6.822489471s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.815676041 +0000 UTC m=+148.688302752" watchObservedRunningTime="2026-02-20 06:48:53.822489471 +0000 UTC m=+148.695116182"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.843997 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.844814 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.344789249 +0000 UTC m=+149.217415960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.849008 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l2hxn" podStartSLOduration=8.848996029 podStartE2EDuration="8.848996029s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.847387311 +0000 UTC m=+148.720014022" watchObservedRunningTime="2026-02-20 06:48:53.848996029 +0000 UTC m=+148.721622730"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.946738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.947135 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.447122582 +0000 UTC m=+149.319749293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.967048 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" podStartSLOduration=126.967023924 podStartE2EDuration="2m6.967023924s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.898086391 +0000 UTC m=+148.770713102" watchObservedRunningTime="2026-02-20 06:48:53.967023924 +0000 UTC m=+148.839650635"
Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.967160 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" podStartSLOduration=127.967157338 podStartE2EDuration="2m7.967157338s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.960766806 +0000 UTC m=+148.833393517" watchObservedRunningTime="2026-02-20 06:48:53.967157338 +0000 UTC m=+148.839784039"
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.021058 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" podStartSLOduration=127.021035953 podStartE2EDuration="2m7.021035953s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.019303092 +0000 UTC m=+148.891929803" watchObservedRunningTime="2026-02-20 06:48:54.021035953 +0000 UTC m=+148.893662664"
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.048390 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.048836 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.548819781 +0000 UTC m=+149.421446492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.048981 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 06:48:54 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld
Feb 20 06:48:54 crc kubenswrapper[5094]: [+]process-running ok
Feb 20 06:48:54 crc kubenswrapper[5094]: healthz check failed
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.049047 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.093159 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" podStartSLOduration=127.093142981 podStartE2EDuration="2m7.093142981s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.091159953 +0000 UTC m=+148.963786664" watchObservedRunningTime="2026-02-20 06:48:54.093142981 +0000 UTC m=+148.965769692"
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.155483 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.155863 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.655850055 +0000 UTC m=+149.528476766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.252521 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" podStartSLOduration=127.252496573 podStartE2EDuration="2m7.252496573s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.250405744 +0000 UTC m=+149.123032455" watchObservedRunningTime="2026-02-20 06:48:54.252496573 +0000 UTC m=+149.125123284"
Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.257044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.257591 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.757562224 +0000 UTC m=+149.630188935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.358744 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.359689 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.859669301 +0000 UTC m=+149.732296012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.363493 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.409615 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" podStartSLOduration=127.409588564 podStartE2EDuration="2m7.409588564s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.320870252 +0000 UTC m=+149.193496963" watchObservedRunningTime="2026-02-20 06:48:54.409588564 +0000 UTC m=+149.282215275" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.460468 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.461100 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 06:48:54.961078992 +0000 UTC m=+149.833705703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.483604 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" podStartSLOduration=127.483581826 podStartE2EDuration="2m7.483581826s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.411096619 +0000 UTC m=+149.283723330" watchObservedRunningTime="2026-02-20 06:48:54.483581826 +0000 UTC m=+149.356208537" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.553301 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" podStartSLOduration=127.553281576 podStartE2EDuration="2m7.553281576s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.485538512 +0000 UTC m=+149.358165223" watchObservedRunningTime="2026-02-20 06:48:54.553281576 +0000 UTC m=+149.425908277" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.563917 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.564319 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.064307497 +0000 UTC m=+149.936934208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.666977 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.668688 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.168664438 +0000 UTC m=+150.041291149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.677870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"50733f8ae42d0ce2ee3b276ab62846f77e5e69e3728251894751a5554c4017e3"} Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.679406 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.679443 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.695257 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.698971 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.770973 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.776175 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.276158623 +0000 UTC m=+150.148785564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.877788 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.878574 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.378555478 +0000 UTC m=+150.251182189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.980340 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.980718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.980848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.981211 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.481187738 +0000 UTC m=+150.353814449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.981986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.982353 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.985354 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:54 crc 
kubenswrapper[5094]: I0220 06:48:54.992784 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.993407 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.004893 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.006353 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.011010 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.023752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.026017 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.051173 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:55 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:55 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:55 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.051243 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.084002 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.583979522 +0000 UTC m=+150.456606233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.083901 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084458 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084867 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084982 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.085164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.085520 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.585503478 +0000 UTC m=+150.458130189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.179166 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.186487 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.186945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.187055 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod \"certified-operators-7zj2v\" (UID: 
\"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.187134 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.187734 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.687713248 +0000 UTC m=+150.560339949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.188322 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.188366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod 
\"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.191191 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.191900 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.192278 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.197231 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.234775 5094 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.250735 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.256531 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.288256 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod 
\"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.288902 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.289011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.289099 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.289430 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.789408657 +0000 UTC m=+150.662035368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.338958 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391120 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391450 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391542 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391584 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.392467 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.392557 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.892530299 +0000 UTC m=+150.765157010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.393298 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.394889 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.396291 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.415273 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.484822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497433 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497476 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497508 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.497834 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.997821862 +0000 UTC m=+150.870448573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.514327 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.596985 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.598064 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.598783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599130 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599151 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " 
pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.599739 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.099726115 +0000 UTC m=+150.972352826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.624505 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.633807 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.666366 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702586 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702631 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.703389 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.203368689 +0000 UTC m=+151.075995400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.764534 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"3c2807259a1e3547abf1555a7d4f410742bfe18a3992ceee40a375b7ab31491a"} Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.764584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"1eced95afed87dd49973c9a7b8fecabb40b5b049ad20de866ba6fc4bc08b0fc0"} Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.810514 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.810805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " 
pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.810970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.811022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.812072 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.312056542 +0000 UTC m=+151.184683253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.812721 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.813276 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" podStartSLOduration=10.813257201 podStartE2EDuration="10.813257201s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:55.810739381 +0000 UTC m=+150.683366092" watchObservedRunningTime="2026-02-20 06:48:55.813257201 +0000 UTC m=+150.685883912" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.815019 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.851492 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmmb7\" (UniqueName: 
\"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.913234 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.913592 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.413580046 +0000 UTC m=+151.286206757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.996078 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.014537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:56 crc kubenswrapper[5094]: E0220 06:48:56.015133 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.51511709 +0000 UTC m=+151.387743801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.040195 5094 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-20T06:48:55.234801403Z","Handler":null,"Name":""} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050120 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:56 crc kubenswrapper[5094]: [-]has-synced failed: 
reason withheld Feb 20 06:48:56 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:56 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050167 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050711 5094 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050745 5094 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.117471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.119979 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.120011 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.127352 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f WatchSource:0}: Error finding container bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f: Status 404 returned error can't find the container with id bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.205206 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9 WatchSource:0}: Error finding container 38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9: Status 404 returned error can't find the container with id 38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.213073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.358807 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.386941 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.409840 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d239c4_71e0_42e2_a2e7_69acd87b5986.slice/crio-d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266 WatchSource:0}: Error finding container d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266: Status 404 returned error can't find the container with id d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.418965 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.435355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.460073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.492815 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.495129 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.510685 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c35d10_d5cc_468f_95a1_b56fde3961b3.slice/crio-e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133 WatchSource:0}: Error finding container e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133: Status 404 returned error can't find the container with id e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.773928 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerStarted","Data":"e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.775314 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"20503c16da8e0a242dea890729372365471b62e53857bc200f7164d82e653bcd"} Feb 20 06:48:56 crc 
kubenswrapper[5094]: I0220 06:48:56.775336 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.781999 5094 generic.go:334] "Generic (PLEG): container finished" podID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerID="ef3c91c789b3a8c2e2c7b970bccc8b3b862a287138831c82867ac750baa00328" exitCode=0
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.782353 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"ef3c91c789b3a8c2e2c7b970bccc8b3b862a287138831c82867ac750baa00328"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.782395 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerStarted","Data":"d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.783895 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.786602 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerStarted","Data":"52b9be5e00a8e14d989754bcec98f6733777f2e861c8ab554e5876f65f7c7c0b"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.792833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"837d9f622f86b6681941d2ed3daab1dd90f965a8ccdd36d619348d6b31d3cefb"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.792884 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ab92da8ed705d039b49ca77f094f134766be8706ef1bd5528abc5a65acbe03d4"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.800418 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"93ebb1e3c85fe9851f6799dfdd8ac3729027c0f1b0b45968e87380ff7ba3ca22"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.800605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.801376 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.816366 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerStarted","Data":"a1dea229adcbee55ddf6e0b41aedbdc8cf35c14a6f68369e5313338e917770ed"}
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.822645 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"]
Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.971685 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6b00ff_07fb_4e9a_80da_780c22acbe69.slice/crio-cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a WatchSource:0}: Error finding container cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a: Status 404 returned error can't find the container with id cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.042920 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 06:48:57 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld
Feb 20 06:48:57 crc kubenswrapper[5094]: [+]process-running ok
Feb 20 06:48:57 crc kubenswrapper[5094]: healthz check failed
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.043039 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.172549 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"]
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.174440 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.176915 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.187550 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"]
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.278878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.279040 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.279080 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.380845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.380973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.381000 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.381967 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.382202 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.401437 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.490145 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.577916 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"]
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.580767 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.585607 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"]
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.684671 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.684779 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.684803 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.786696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.786775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.786913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.788621 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.791217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.798934 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"]
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.809975 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: W0220 06:48:57.821998 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c1eecf_1cc2_4480_ac22_99a970f5dc58.slice/crio-b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303 WatchSource:0}: Error finding container b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303: Status 404 returned error can't find the container with id b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.855842 5094 generic.go:334] "Generic (PLEG): container finished" podID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerID="6eb6b4fa4af198a75121d1f1d8845384553bbedcf18e198cdf00cc8282d9f5b7" exitCode=0
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.874405 5094 generic.go:334] "Generic (PLEG): container finished" podID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerID="11182af2209155c05ce50ce6f5457662dfc62f8d75b0ebcee01f179c458884f9" exitCode=0
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.876961 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.877751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"6eb6b4fa4af198a75121d1f1d8845384553bbedcf18e198cdf00cc8282d9f5b7"}
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.877787 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"11182af2209155c05ce50ce6f5457662dfc62f8d75b0ebcee01f179c458884f9"}
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.880397 5094 generic.go:334] "Generic (PLEG): container finished" podID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" exitCode=0
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.880482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9"}
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.884692 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerStarted","Data":"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"}
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.884767 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerStarted","Data":"cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a"}
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.899534 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.977103 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" podStartSLOduration=130.977079948 podStartE2EDuration="2m10.977079948s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:57.975978022 +0000 UTC m=+152.848604733" watchObservedRunningTime="2026-02-20 06:48:57.977079948 +0000 UTC m=+152.849706659"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.045104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9sztr"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.050501 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 06:48:58 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld
Feb 20 06:48:58 crc kubenswrapper[5094]: [+]process-running ok
Feb 20 06:48:58 crc kubenswrapper[5094]: healthz check failed
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.050580 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.104534 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.104604 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.120022 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.144932 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.144984 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.166860 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.183969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"]
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.185818 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.188454 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.203880 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"]
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.215121 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"]
Feb 20 06:48:58 crc kubenswrapper[5094]: W0220 06:48:58.215195 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd307716f_90ad_4b6b_9a49_10f8b5a98721.slice/crio-accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b WatchSource:0}: Error finding container accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b: Status 404 returned error can't find the container with id accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.308598 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.308686 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.308735 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.326773 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.326816 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.327456 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.327485 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.410844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.410904 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.410928 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.412093 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.412310 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.441555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.511095 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.586362 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"]
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.587559 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.605026 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"]
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.650312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.650375 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.664726 5094 patch_prober.go:28] interesting pod/console-f9d7485db-shq4j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.664796 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-shq4j" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.715038 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.715111 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.715147 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.816492 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.816574 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.816601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.817241 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.817886 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.836913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.875461 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.877981 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.882579 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.882805 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.888403 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.901773 5094 generic.go:334] "Generic (PLEG): container finished" podID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerID="97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5" exitCode=0
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.901834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerDied","Data":"97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5"}
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.905078 5094 generic.go:334] "Generic (PLEG): container finished" podID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerID="4270db803bcdf262f5c0b9fda9d8278a355396ffba48defc3ee731db9488d8d6" exitCode=0
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.905382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"4270db803bcdf262f5c0b9fda9d8278a355396ffba48defc3ee731db9488d8d6"}
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.905437 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerStarted","Data":"accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b"}
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.909240 5094 generic.go:334] "Generic (PLEG): container finished" podID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerID="064b2c1894993da3d72d5837edcc05f078452e781464dc1e2a5cab9234eb9f15" exitCode=0
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.909601 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"064b2c1894993da3d72d5837edcc05f078452e781464dc1e2a5cab9234eb9f15"}
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.911934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerStarted","Data":"b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303"}
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.912406 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.930850 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-p4blk"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.936350 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"
Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.944180 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.027633 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.027807 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.059870 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 06:48:59 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld
Feb 20 06:48:59 crc kubenswrapper[5094]: [+]process-running ok
Feb 20 06:48:59 crc kubenswrapper[5094]: healthz check failed
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.059932 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.093242 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"]
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.131848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.131937 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.132314 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:59 crc kubenswrapper[5094]: W0220 06:48:59.158316 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ecc3e73_dd76_4a73_a366_92c78aca386e.slice/crio-0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288 WatchSource:0}: Error finding container 0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288: Status 404 returned error can't find the container with id 0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.169496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.202202 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.502450 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"]
Feb 20 06:48:59 crc kubenswrapper[5094]: W0220 06:48:59.600767 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4803c5cf_27e3_414a_8fa4_7e82730e311d.slice/crio-22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5 WatchSource:0}: Error finding container 22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5: Status 404 returned error can't find the container with id 22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.835617 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 20 06:48:59 crc kubenswrapper[5094]: W0220 06:48:59.861323 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podecd4e39f_130b_4f63_aedb_6cda9ec7da80.slice/crio-42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced WatchSource:0}: Error finding container 42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced: Status 404 returned error can't find the container with id 42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.951203 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" exitCode=0
Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.951411 5094 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.951442 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerStarted","Data":"0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.955490 5094 generic.go:334] "Generic (PLEG): container finished" podID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805" exitCode=0 Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.955543 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.955564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerStarted","Data":"22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.982296 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerStarted","Data":"42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced"} Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.043027 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:00 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:00 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:00 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.043117 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.445659 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.574114 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"806ba791-714c-4d13-b595-d4f6ccf06aea\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.574437 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"806ba791-714c-4d13-b595-d4f6ccf06aea\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.574516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"806ba791-714c-4d13-b595-d4f6ccf06aea\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.575407 5094 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume" (OuterVolumeSpecName: "config-volume") pod "806ba791-714c-4d13-b595-d4f6ccf06aea" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.584327 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "806ba791-714c-4d13-b595-d4f6ccf06aea" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.602990 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r" (OuterVolumeSpecName: "kube-api-access-j498r") pod "806ba791-714c-4d13-b595-d4f6ccf06aea" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea"). InnerVolumeSpecName "kube-api-access-j498r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.677133 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.677208 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.677222 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.723827 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 06:49:00 crc kubenswrapper[5094]: E0220 06:49:00.724194 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerName="collect-profiles" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.724232 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerName="collect-profiles" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.724401 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerName="collect-profiles" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.727352 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.730640 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.730909 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.731303 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.779390 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.779466 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.881361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.881420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.881527 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.900544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.006677 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerDied","Data":"2cba69ee98eb39e06e5e23c24335f67d934d7ddd307c2f258f0da6b72887c796"} Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.006758 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cba69ee98eb39e06e5e23c24335f67d934d7ddd307c2f258f0da6b72887c796" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.006834 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.013197 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerStarted","Data":"eac6d043a26a977dd2c97385c1aee97e96990a378d70f7183c056dc53f6aea96"} Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.032642 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.032622099 podStartE2EDuration="3.032622099s" podCreationTimestamp="2026-02-20 06:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:01.028894601 +0000 UTC m=+155.901521312" watchObservedRunningTime="2026-02-20 06:49:01.032622099 +0000 UTC m=+155.905248810" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.043496 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.044314 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:01 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:01 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:01 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.044359 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.727116 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 06:49:01 crc kubenswrapper[5094]: W0220 06:49:01.772015 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8b3e1407_37b6_4b4b_ac69_7f5acdcac274.slice/crio-1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4 WatchSource:0}: Error finding container 1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4: Status 404 returned error can't find the container with id 1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4 Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.030063 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerStarted","Data":"1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4"} Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048094 5094 patch_prober.go:28] 
interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:02 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:02 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:02 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048172 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048425 5094 generic.go:334] "Generic (PLEG): container finished" podID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerID="eac6d043a26a977dd2c97385c1aee97e96990a378d70f7183c056dc53f6aea96" exitCode=0 Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048469 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerDied","Data":"eac6d043a26a977dd2c97385c1aee97e96990a378d70f7183c056dc53f6aea96"} Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.042807 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:03 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:03 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:03 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.042888 5094 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.060095 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerStarted","Data":"3d1815b1d3749a263d6e1a9f306addfab6cde409eabb2876a93885391fc3aaf6"} Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.429140 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.460802 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.460775655 podStartE2EDuration="3.460775655s" podCreationTimestamp="2026-02-20 06:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:03.081805422 +0000 UTC m=+157.954432133" watchObservedRunningTime="2026-02-20 06:49:03.460775655 +0000 UTC m=+158.333402366" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561280 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ecd4e39f-130b-4f63-aedb-6cda9ec7da80" (UID: "ecd4e39f-130b-4f63-aedb-6cda9ec7da80"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561350 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561742 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.570045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ecd4e39f-130b-4f63-aedb-6cda9ec7da80" (UID: "ecd4e39f-130b-4f63-aedb-6cda9ec7da80"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.663806 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.043728 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:04 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:04 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:04 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.044139 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.073395 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l2hxn" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.084878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerDied","Data":"42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced"} Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.084956 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.085033 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.103321 5094 generic.go:334] "Generic (PLEG): container finished" podID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerID="3d1815b1d3749a263d6e1a9f306addfab6cde409eabb2876a93885391fc3aaf6" exitCode=0 Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.104069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerDied","Data":"3d1815b1d3749a263d6e1a9f306addfab6cde409eabb2876a93885391fc3aaf6"} Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.107009 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.107058 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:49:05 crc kubenswrapper[5094]: I0220 06:49:05.043523 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:49:05 crc kubenswrapper[5094]: I0220 06:49:05.055037 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:49:08 crc kubenswrapper[5094]: I0220 06:49:08.340372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:49:08 
crc kubenswrapper[5094]: I0220 06:49:08.649388 5094 patch_prober.go:28] interesting pod/console-f9d7485db-shq4j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 20 06:49:08 crc kubenswrapper[5094]: I0220 06:49:08.649466 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-shq4j" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 20 06:49:09 crc kubenswrapper[5094]: I0220 06:49:09.780401 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:49:09 crc kubenswrapper[5094]: I0220 06:49:09.794016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:49:09 crc kubenswrapper[5094]: I0220 06:49:09.864877 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.030705 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.203738 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.203802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.204102 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b3e1407-37b6-4b4b-ac69-7f5acdcac274" (UID: "8b3e1407-37b6-4b4b-ac69-7f5acdcac274"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.218876 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b3e1407-37b6-4b4b-ac69-7f5acdcac274" (UID: "8b3e1407-37b6-4b4b-ac69-7f5acdcac274"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.229894 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerDied","Data":"1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4"} Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.229947 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.229948 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.305651 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.305720 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.466610 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:49:18 crc kubenswrapper[5094]: I0220 06:49:18.655565 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:49:18 crc kubenswrapper[5094]: I0220 06:49:18.660654 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.148781 5094 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.149901 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gmmb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-85kbx_openshift-marketplace(4aa46aec-af59-49c6-9ff5-a08df3d68e5b): ErrImagePull: rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.151827 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-85kbx" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.215097 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.215820 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxwcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7zj2v_openshift-marketplace(66c35d10-d5cc-468f-95a1-b56fde3961b3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.217195 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7zj2v" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" Feb 20 06:49:28 crc 
kubenswrapper[5094]: E0220 06:49:28.241225 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.241394 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2f7t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-c94fj_openshift-marketplace(88e94523-c126-4ce8-a6c7-2f83eb91d3fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.243163 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-c94fj" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.307348 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-85kbx" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.307608 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-c94fj" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.316038 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7zj2v" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" Feb 20 06:49:28 crc kubenswrapper[5094]: I0220 06:49:28.540416 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8ww4n"] Feb 20 
06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.018827 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.310345 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.310445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.324334 5094 generic.go:334] "Generic (PLEG): container finished" podID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerID="ca9993542532855c09ba40fb79d1b2ff1916ab7e330faa07724168697397276c" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.324420 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"ca9993542532855c09ba40fb79d1b2ff1916ab7e330faa07724168697397276c"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.335088 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerStarted","Data":"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.337327 5094 generic.go:334] "Generic (PLEG): container finished" podID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerID="a98e2e4e640eec309b5e0629faade0edf22365e6a1089be8c0022fcfa2fa99aa" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 
06:49:29.337414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"a98e2e4e640eec309b5e0629faade0edf22365e6a1089be8c0022fcfa2fa99aa"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.344665 5094 generic.go:334] "Generic (PLEG): container finished" podID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerID="bffc4090bcdc589daef83a404f87b0aa9489a6b07489e331199c26107f2625bc" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.344754 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"bffc4090bcdc589daef83a404f87b0aa9489a6b07489e331199c26107f2625bc"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.346688 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" event={"ID":"da0aa093-1adc-45f2-a942-e68d7be23ed4","Type":"ContainerStarted","Data":"d85aaa7493881a069ae0b83502388eaef5721562cb786bfb34d14ba2e81caced"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.346747 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" event={"ID":"da0aa093-1adc-45f2-a942-e68d7be23ed4","Type":"ContainerStarted","Data":"9753f8c6e0e76ae1629d27e1385daf8a5fb26b4937299a3ce52233b105ee6329"} Feb 20 06:49:30 crc kubenswrapper[5094]: I0220 06:49:30.357077 5094 generic.go:334] "Generic (PLEG): container finished" podID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be" exitCode=0 Feb 20 06:49:30 crc kubenswrapper[5094]: I0220 06:49:30.357155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" 
event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be"} Feb 20 06:49:30 crc kubenswrapper[5094]: I0220 06:49:30.360532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" event={"ID":"da0aa093-1adc-45f2-a942-e68d7be23ed4","Type":"ContainerStarted","Data":"29f4d19d0726c8d6248c1e530378b460824c6a6903ba1d1131570c283b8ce0cf"} Feb 20 06:49:31 crc kubenswrapper[5094]: I0220 06:49:31.404096 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8ww4n" podStartSLOduration=164.404075667 podStartE2EDuration="2m44.404075667s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:31.400211395 +0000 UTC m=+186.272838106" watchObservedRunningTime="2026-02-20 06:49:31.404075667 +0000 UTC m=+186.276702378" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.392627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerStarted","Data":"b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.396391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerStarted","Data":"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.398421 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" 
event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerStarted","Data":"289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.400285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerStarted","Data":"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.402361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerStarted","Data":"7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.417147 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d5dnl" podStartSLOduration=2.68878911 podStartE2EDuration="38.417127972s" podCreationTimestamp="2026-02-20 06:48:55 +0000 UTC" firstStartedPulling="2026-02-20 06:48:56.783564766 +0000 UTC m=+151.656191477" lastFinishedPulling="2026-02-20 06:49:32.511903628 +0000 UTC m=+187.384530339" observedRunningTime="2026-02-20 06:49:33.415252898 +0000 UTC m=+188.287879629" watchObservedRunningTime="2026-02-20 06:49:33.417127972 +0000 UTC m=+188.289754683" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.438123 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hw2zj" podStartSLOduration=2.725728263 podStartE2EDuration="36.438103719s" podCreationTimestamp="2026-02-20 06:48:57 +0000 UTC" firstStartedPulling="2026-02-20 06:48:58.907918578 +0000 UTC m=+153.780545279" lastFinishedPulling="2026-02-20 06:49:32.620294004 +0000 UTC m=+187.492920735" observedRunningTime="2026-02-20 06:49:33.435133629 +0000 UTC m=+188.307760340" 
watchObservedRunningTime="2026-02-20 06:49:33.438103719 +0000 UTC m=+188.310730430" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.478681 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tpwcx" podStartSLOduration=3.215710933 podStartE2EDuration="35.478657159s" podCreationTimestamp="2026-02-20 06:48:58 +0000 UTC" firstStartedPulling="2026-02-20 06:48:59.963009162 +0000 UTC m=+154.835635873" lastFinishedPulling="2026-02-20 06:49:32.225955388 +0000 UTC m=+187.098582099" observedRunningTime="2026-02-20 06:49:33.457813106 +0000 UTC m=+188.330439817" watchObservedRunningTime="2026-02-20 06:49:33.478657159 +0000 UTC m=+188.351283870" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.479740 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r5gcc" podStartSLOduration=2.71596603 podStartE2EDuration="35.479732135s" podCreationTimestamp="2026-02-20 06:48:58 +0000 UTC" firstStartedPulling="2026-02-20 06:48:59.963633956 +0000 UTC m=+154.836260667" lastFinishedPulling="2026-02-20 06:49:32.727400061 +0000 UTC m=+187.600026772" observedRunningTime="2026-02-20 06:49:33.476642162 +0000 UTC m=+188.349268873" watchObservedRunningTime="2026-02-20 06:49:33.479732135 +0000 UTC m=+188.352358846" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.495354 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jlc84" podStartSLOduration=2.895338729 podStartE2EDuration="36.495309154s" podCreationTimestamp="2026-02-20 06:48:57 +0000 UTC" firstStartedPulling="2026-02-20 06:48:58.930078773 +0000 UTC m=+153.802705484" lastFinishedPulling="2026-02-20 06:49:32.530049208 +0000 UTC m=+187.402675909" observedRunningTime="2026-02-20 06:49:33.494147376 +0000 UTC m=+188.366774087" watchObservedRunningTime="2026-02-20 06:49:33.495309154 +0000 UTC m=+188.367935865" Feb 20 06:49:34 crc 
kubenswrapper[5094]: I0220 06:49:34.107155 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:49:34 crc kubenswrapper[5094]: I0220 06:49:34.107233 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.211286 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.667921 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.668011 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.811925 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:36 crc kubenswrapper[5094]: I0220 06:49:36.456021 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.469468 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.490898 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.491917 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.557988 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.776619 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.900351 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.900424 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.939564 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.472937 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.482722 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.511903 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.512264 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 
06:49:38.945190 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.945684 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.119648 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 06:49:39 crc kubenswrapper[5094]: E0220 06:49:39.119912 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.119925 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: E0220 06:49:39.119945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.119950 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.120043 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.120062 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.120452 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.127231 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.127482 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.132140 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.191122 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.191235 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.292422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.292772 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.292610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.338994 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.439881 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d5dnl" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server" containerID="cri-o://b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d" gracePeriod=2 Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.440255 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.577868 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tpwcx" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" probeResult="failure" output=< Feb 20 06:49:39 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:49:39 crc kubenswrapper[5094]: > Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.913498 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 06:49:39 crc kubenswrapper[5094]: W0220 06:49:39.935814 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4e4b262a_e29c_492e_aba7_4d09a33ba01f.slice/crio-868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831 WatchSource:0}: Error finding container 868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831: Status 404 returned error can't find the container with id 868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831 Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.978106 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"] Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.005171 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r5gcc" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" probeResult="failure" output=< Feb 20 06:49:40 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:49:40 crc kubenswrapper[5094]: > Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.450049 5094 generic.go:334] "Generic (PLEG): container finished" podID="94d239c4-71e0-42e2-a2e7-69acd87b5986" 
containerID="b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d" exitCode=0
Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.450133 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d"}
Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.453077 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e4b262a-e29c-492e-aba7-4d09a33ba01f","Type":"ContainerStarted","Data":"868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831"}
Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.453417 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hw2zj" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server" containerID="cri-o://7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0" gracePeriod=2
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.341636 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl"
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.463310 5094 generic.go:334] "Generic (PLEG): container finished" podID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerID="726b2f542d8a22099222541b412c4499abb5b4cb409d678a29dba6f8fa1aff1a" exitCode=0
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.464257 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e4b262a-e29c-492e-aba7-4d09a33ba01f","Type":"ContainerDied","Data":"726b2f542d8a22099222541b412c4499abb5b4cb409d678a29dba6f8fa1aff1a"}
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.470215 5094 generic.go:334] "Generic (PLEG): container finished" podID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerID="7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0" exitCode=0
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.470299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0"}
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.472457 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266"}
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.472500 5094 scope.go:117] "RemoveContainer" containerID="b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d"
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.472640 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl"
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.503262 5094 scope.go:117] "RemoveContainer" containerID="bffc4090bcdc589daef83a404f87b0aa9489a6b07489e331199c26107f2625bc"
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.522364 5094 scope.go:117] "RemoveContainer" containerID="ef3c91c789b3a8c2e2c7b970bccc8b3b862a287138831c82867ac750baa00328"
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.525692 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"94d239c4-71e0-42e2-a2e7-69acd87b5986\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") "
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.525813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"94d239c4-71e0-42e2-a2e7-69acd87b5986\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") "
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.525981 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"94d239c4-71e0-42e2-a2e7-69acd87b5986\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") "
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.526647 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities" (OuterVolumeSpecName: "utilities") pod "94d239c4-71e0-42e2-a2e7-69acd87b5986" (UID: "94d239c4-71e0-42e2-a2e7-69acd87b5986"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.532854 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c" (OuterVolumeSpecName: "kube-api-access-dk49c") pod "94d239c4-71e0-42e2-a2e7-69acd87b5986" (UID: "94d239c4-71e0-42e2-a2e7-69acd87b5986"). InnerVolumeSpecName "kube-api-access-dk49c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.584192 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94d239c4-71e0-42e2-a2e7-69acd87b5986" (UID: "94d239c4-71e0-42e2-a2e7-69acd87b5986"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.601386 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.627934 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"d307716f-90ad-4b6b-9a49-10f8b5a98721\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") "
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628004 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"d307716f-90ad-4b6b-9a49-10f8b5a98721\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") "
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628046 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"d307716f-90ad-4b6b-9a49-10f8b5a98721\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") "
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628217 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628230 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628242 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.629130 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities" (OuterVolumeSpecName: "utilities") pod "d307716f-90ad-4b6b-9a49-10f8b5a98721" (UID: "d307716f-90ad-4b6b-9a49-10f8b5a98721"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.632522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq" (OuterVolumeSpecName: "kube-api-access-985sq") pod "d307716f-90ad-4b6b-9a49-10f8b5a98721" (UID: "d307716f-90ad-4b6b-9a49-10f8b5a98721"). InnerVolumeSpecName "kube-api-access-985sq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.651399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d307716f-90ad-4b6b-9a49-10f8b5a98721" (UID: "d307716f-90ad-4b6b-9a49-10f8b5a98721"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.729467 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.729507 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.729519 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.832020 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"]
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.838303 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"]
Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.856329 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" path="/var/lib/kubelet/pods/94d239c4-71e0-42e2-a2e7-69acd87b5986/volumes"
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.483771 5094 generic.go:334] "Generic (PLEG): container finished" podID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerID="a3085de07d8490c70b05f62d546c4844150be55db1f8b370f140f0fcadcb36da" exitCode=0
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.483817 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"a3085de07d8490c70b05f62d546c4844150be55db1f8b370f140f0fcadcb36da"}
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.488181 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj"
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.488558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b"}
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.488584 5094 scope.go:117] "RemoveContainer" containerID="7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0"
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.524325 5094 scope.go:117] "RemoveContainer" containerID="a98e2e4e640eec309b5e0629faade0edf22365e6a1089be8c0022fcfa2fa99aa"
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.535565 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"]
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.543320 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"]
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.558259 5094 scope.go:117] "RemoveContainer" containerID="4270db803bcdf262f5c0b9fda9d8278a355396ffba48defc3ee731db9488d8d6"
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.842983 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948203 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") "
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") "
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948331 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4e4b262a-e29c-492e-aba7-4d09a33ba01f" (UID: "4e4b262a-e29c-492e-aba7-4d09a33ba01f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948781 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.952563 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4e4b262a-e29c-492e-aba7-4d09a33ba01f" (UID: "4e4b262a-e29c-492e-aba7-4d09a33ba01f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.050038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.503175 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerStarted","Data":"0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836"}
Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.508125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e4b262a-e29c-492e-aba7-4d09a33ba01f","Type":"ContainerDied","Data":"868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831"}
Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.508180 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831"
Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.508252 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.539636 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c94fj" podStartSLOduration=3.545939544 podStartE2EDuration="48.539604517s" podCreationTimestamp="2026-02-20 06:48:55 +0000 UTC" firstStartedPulling="2026-02-20 06:48:57.874681863 +0000 UTC m=+152.747308594" lastFinishedPulling="2026-02-20 06:49:42.868346846 +0000 UTC m=+197.740973567" observedRunningTime="2026-02-20 06:49:43.538220994 +0000 UTC m=+198.410847735" watchObservedRunningTime="2026-02-20 06:49:43.539604517 +0000 UTC m=+198.412231268"
Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.852330 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" path="/var/lib/kubelet/pods/d307716f-90ad-4b6b-9a49-10f8b5a98721/volumes"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.514795 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c94fj"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.515410 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c94fj"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.519809 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520175 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-utilities"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520269 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-utilities"
Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520352 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-content"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520428 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-content"
Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520508 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerName="pruner"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520584 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerName="pruner"
Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520663 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-utilities"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520757 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-utilities"
Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520853 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520946 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server"
Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.521044 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-content"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521121 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-content"
Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.521201 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521281 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521506 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerName="pruner"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521606 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521691 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.522260 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.525614 5094 generic.go:334] "Generic (PLEG): container finished" podID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerID="61352ad384c7169a9a29e90c914460822eb2dc45803cccf5cac7d1c7d42a40b1" exitCode=0
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.525762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"61352ad384c7169a9a29e90c914460822eb2dc45803cccf5cac7d1c7d42a40b1"}
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.529126 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.530241 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.540025 5094 generic.go:334] "Generic (PLEG): container finished" podID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" exitCode=0
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.540095 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025"}
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.540871 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.596446 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c94fj"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.690726 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.690861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.690889 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.791953 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792010 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792183 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.813569 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.864669 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.401060 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 20 06:49:46 crc kubenswrapper[5094]: W0220 06:49:46.414159 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6d73a928_b634_44c7_a3ca_8ffc9a40277e.slice/crio-86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f WatchSource:0}: Error finding container 86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f: Status 404 returned error can't find the container with id 86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f
Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.547963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerStarted","Data":"86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f"}
Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.552130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerStarted","Data":"beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4"}
Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.559747 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerStarted","Data":"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53"}
Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.579829 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zj2v" podStartSLOduration=4.524458684 podStartE2EDuration="52.579803331s" podCreationTimestamp="2026-02-20 06:48:54 +0000 UTC" firstStartedPulling="2026-02-20 06:48:57.879552708 +0000 UTC m=+152.752179429" lastFinishedPulling="2026-02-20 06:49:45.934897325 +0000 UTC m=+200.807524076" observedRunningTime="2026-02-20 06:49:46.57208989 +0000 UTC m=+201.444716601" watchObservedRunningTime="2026-02-20 06:49:46.579803331 +0000 UTC m=+201.452430062"
Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.595207 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85kbx" podStartSLOduration=3.455875063 podStartE2EDuration="51.595178265s" podCreationTimestamp="2026-02-20 06:48:55 +0000 UTC" firstStartedPulling="2026-02-20 06:48:57.884175638 +0000 UTC m=+152.756802349" lastFinishedPulling="2026-02-20 06:49:46.02347882 +0000 UTC m=+200.896105551" observedRunningTime="2026-02-20 06:49:46.593209959 +0000 UTC m=+201.465836680" watchObservedRunningTime="2026-02-20 06:49:46.595178265 +0000 UTC m=+201.467804986"
Feb 20 06:49:47 crc kubenswrapper[5094]: I0220 06:49:47.567317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerStarted","Data":"c577f6af9ab3888d7eaafd2ffe5bc4c1228acf1f9ea1fd93255848a9f2a96cbc"}
Feb 20 06:49:47 crc kubenswrapper[5094]: I0220 06:49:47.586902 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.5868788780000003 podStartE2EDuration="2.586878878s" podCreationTimestamp="2026-02-20 06:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:47.58275918 +0000 UTC m=+202.455385891" watchObservedRunningTime="2026-02-20 06:49:47.586878878 +0000 UTC m=+202.459505589"
Feb 20 06:49:48 crc kubenswrapper[5094]: I0220 06:49:48.580257 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:49:48 crc kubenswrapper[5094]: I0220 06:49:48.664216 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tpwcx"
Feb 20 06:49:49 crc kubenswrapper[5094]: I0220 06:49:49.001588 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:49:49 crc kubenswrapper[5094]: I0220 06:49:49.079370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:49:52 crc kubenswrapper[5094]: I0220 06:49:52.584054 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"]
Feb 20 06:49:52 crc kubenswrapper[5094]: I0220 06:49:52.584882 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r5gcc" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" containerID="cri-o://51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb" gracePeriod=2
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.091827 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.223026 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"4803c5cf-27e3-414a-8fa4-7e82730e311d\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") "
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.223199 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"4803c5cf-27e3-414a-8fa4-7e82730e311d\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") "
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.224767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities" (OuterVolumeSpecName: "utilities") pod "4803c5cf-27e3-414a-8fa4-7e82730e311d" (UID: "4803c5cf-27e3-414a-8fa4-7e82730e311d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.225797 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"4803c5cf-27e3-414a-8fa4-7e82730e311d\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") "
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.226261 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.230328 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb" (OuterVolumeSpecName: "kube-api-access-vkfpb") pod "4803c5cf-27e3-414a-8fa4-7e82730e311d" (UID: "4803c5cf-27e3-414a-8fa4-7e82730e311d"). InnerVolumeSpecName "kube-api-access-vkfpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.327845 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.391116 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4803c5cf-27e3-414a-8fa4-7e82730e311d" (UID: "4803c5cf-27e3-414a-8fa4-7e82730e311d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.429190 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616822 5094 generic.go:334] "Generic (PLEG): container finished" podID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb" exitCode=0
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616903 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"}
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616948 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5"}
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616981 5094 scope.go:117] "RemoveContainer" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616988 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc"
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.669558 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"]
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.672608 5094 scope.go:117] "RemoveContainer" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be"
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.677948 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"]
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.705177 5094 scope.go:117] "RemoveContainer" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805"
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.728804 5094 scope.go:117] "RemoveContainer" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"
Feb 20 06:49:53 crc kubenswrapper[5094]: E0220 06:49:53.731236 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb\": container with ID starting with 51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb not found: ID does not exist" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"
Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.731341 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"} err="failed to get container status \"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb\": rpc error: code = NotFound desc = could not find container \"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb\": container with ID starting with 51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb not found: ID does 
not exist" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.731469 5094 scope.go:117] "RemoveContainer" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be" Feb 20 06:49:53 crc kubenswrapper[5094]: E0220 06:49:53.732271 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be\": container with ID starting with 6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be not found: ID does not exist" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.732341 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be"} err="failed to get container status \"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be\": rpc error: code = NotFound desc = could not find container \"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be\": container with ID starting with 6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be not found: ID does not exist" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.732392 5094 scope.go:117] "RemoveContainer" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805" Feb 20 06:49:53 crc kubenswrapper[5094]: E0220 06:49:53.732885 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805\": container with ID starting with 7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805 not found: ID does not exist" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.732964 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805"} err="failed to get container status \"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805\": rpc error: code = NotFound desc = could not find container \"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805\": container with ID starting with 7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805 not found: ID does not exist" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.854628 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" path="/var/lib/kubelet/pods/4803c5cf-27e3-414a-8fa4-7e82730e311d/volumes" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.340038 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.340101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.419494 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.581059 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.695602 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.996742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.996834 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:56 crc kubenswrapper[5094]: I0220 06:49:56.068291 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:56 crc kubenswrapper[5094]: I0220 06:49:56.707251 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:57 crc kubenswrapper[5094]: I0220 06:49:57.980564 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:49:58 crc kubenswrapper[5094]: I0220 06:49:58.655536 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85kbx" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" containerID="cri-o://885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" gracePeriod=2 Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.070176 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.219310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.219585 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.219655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.220424 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities" (OuterVolumeSpecName: "utilities") pod "4aa46aec-af59-49c6-9ff5-a08df3d68e5b" (UID: "4aa46aec-af59-49c6-9ff5-a08df3d68e5b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.220807 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.233176 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7" (OuterVolumeSpecName: "kube-api-access-gmmb7") pod "4aa46aec-af59-49c6-9ff5-a08df3d68e5b" (UID: "4aa46aec-af59-49c6-9ff5-a08df3d68e5b"). InnerVolumeSpecName "kube-api-access-gmmb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.275914 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa46aec-af59-49c6-9ff5-a08df3d68e5b" (UID: "4aa46aec-af59-49c6-9ff5-a08df3d68e5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.322566 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.322796 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665095 5094 generic.go:334] "Generic (PLEG): container finished" podID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" exitCode=0 Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665393 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53"} Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"52b9be5e00a8e14d989754bcec98f6733777f2e861c8ab554e5876f65f7c7c0b"} Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665588 5094 scope.go:117] "RemoveContainer" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665829 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.705033 5094 scope.go:117] "RemoveContainer" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.720523 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.728231 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.735922 5094 scope.go:117] "RemoveContainer" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.773672 5094 scope.go:117] "RemoveContainer" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" Feb 20 06:49:59 crc kubenswrapper[5094]: E0220 06:49:59.775637 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53\": container with ID starting with 885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53 not found: ID does not exist" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.775699 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53"} err="failed to get container status \"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53\": rpc error: code = NotFound desc = could not find container \"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53\": container with ID starting with 885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53 not 
found: ID does not exist" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.775977 5094 scope.go:117] "RemoveContainer" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" Feb 20 06:49:59 crc kubenswrapper[5094]: E0220 06:49:59.777336 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025\": container with ID starting with a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025 not found: ID does not exist" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.777430 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025"} err="failed to get container status \"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025\": rpc error: code = NotFound desc = could not find container \"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025\": container with ID starting with a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025 not found: ID does not exist" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.777495 5094 scope.go:117] "RemoveContainer" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" Feb 20 06:49:59 crc kubenswrapper[5094]: E0220 06:49:59.778896 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9\": container with ID starting with 8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9 not found: ID does not exist" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.778961 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9"} err="failed to get container status \"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9\": rpc error: code = NotFound desc = could not find container \"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9\": container with ID starting with 8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9 not found: ID does not exist" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.849077 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" path="/var/lib/kubelet/pods/4aa46aec-af59-49c6-9ff5-a08df3d68e5b/volumes" Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.490404 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" containerID="cri-o://756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734" gracePeriod=15 Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.689991 5094 generic.go:334] "Generic (PLEG): container finished" podID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerID="756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734" exitCode=0 Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.690051 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerDied","Data":"756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734"} Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.995474 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167095 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167218 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc 
kubenswrapper[5094]: I0220 06:50:02.167526 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167621 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167662 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167761 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167819 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167884 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167976 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.168041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.168109 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.171348 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.172319 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.172781 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.173457 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.173949 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.177518 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.178156 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq" (OuterVolumeSpecName: "kube-api-access-zfrvq") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "kube-api-access-zfrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.178339 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.179430 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.179642 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.180461 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.180752 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.181762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.182021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269803 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269870 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269910 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269942 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269971 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269997 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270027 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270054 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270079 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270107 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270135 5094 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270161 5094 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270187 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270214 5094 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.701134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerDied","Data":"48ac16689b00193d6e154a981653d9fe7dd39018c0acc1a7610d05cb116747a3"} Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.701241 5094 scope.go:117] "RemoveContainer" containerID="756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.701391 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.746436 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.749735 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:50:03 crc kubenswrapper[5094]: I0220 06:50:03.848052 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" path="/var/lib/kubelet/pods/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c/volumes" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107035 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107119 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107175 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107916 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.108004 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f" gracePeriod=600 Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.718525 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f" exitCode=0 Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.718595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f"} Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.718667 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645"} Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.277375 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-zqtz6"] Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278548 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278572 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" Feb 20 
06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278604 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278618 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278633 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278645 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278667 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278726 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278741 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278764 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278777 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" Feb 20 
06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278798 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278810 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278966 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278988 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.279010 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.279654 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.289626 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.289637 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.289930 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290099 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290311 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290355 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290661 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.293167 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.293215 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.293252 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 06:50:11 crc 
kubenswrapper[5094]: I0220 06:50:11.293444 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.295388 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.310781 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.311384 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-zqtz6"] Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.322756 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.341272 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " 
pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432885 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432911 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432931 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.433768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.433931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-policies\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.433987 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434222 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434317 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzcv4\" (UniqueName: \"kubernetes.io/projected/b1b4e8ed-3d3f-4742-805b-056836f1216d-kube-api-access-nzcv4\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434440 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-dir\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536185 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " 
pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536301 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536447 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzcv4\" (UniqueName: \"kubernetes.io/projected/b1b4e8ed-3d3f-4742-805b-056836f1216d-kube-api-access-nzcv4\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536492 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-dir\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536581 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536619 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536654 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536695 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536785 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-dir\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536876 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 
06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536940 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536993 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537152 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-policies\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537192 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537748 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.538835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.538968 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.539115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-policies\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc 
kubenswrapper[5094]: I0220 06:50:11.547154 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.547193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.548243 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.551031 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.552160 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-session\") pod 
\"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.552178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.552284 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.556295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.558904 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzcv4\" (UniqueName: \"kubernetes.io/projected/b1b4e8ed-3d3f-4742-805b-056836f1216d-kube-api-access-nzcv4\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.621846 5094 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.073534 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-zqtz6"] Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.781300 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" event={"ID":"b1b4e8ed-3d3f-4742-805b-056836f1216d","Type":"ContainerStarted","Data":"cf1d1661019c22ef5dbd1c613cc1e5fb4ec2e5a7460b946130fe26bdcc99110c"} Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.781781 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.781812 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" event={"ID":"b1b4e8ed-3d3f-4742-805b-056836f1216d","Type":"ContainerStarted","Data":"5d2bb7f9c9b2825c507f1b0a31c03d0ab6fb140b332003f5d5188121804a3b9c"} Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.814646 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" podStartSLOduration=36.814611669 podStartE2EDuration="36.814611669s" podCreationTimestamp="2026-02-20 06:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:50:12.810722597 +0000 UTC m=+227.683349308" watchObservedRunningTime="2026-02-20 06:50:12.814611669 +0000 UTC m=+227.687238410" Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.939763 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:24 crc 
kubenswrapper[5094]: I0220 06:50:24.822131 5094 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824009 5094 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824044 5094 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824221 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824238 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824251 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824260 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824272 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824283 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824298 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824308 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824321 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824334 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824357 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824385 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824534 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824548 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824559 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824573 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824587 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824754 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824764 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824907 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.825308 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826078 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826243 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826302 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" gracePeriod=15 Feb 20 
06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826345 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826396 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.830312 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.952574 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953055 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953109 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953518 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953630 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.954612 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055550 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055632 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055696 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055783 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056741 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056976 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057136 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.883063 5094 generic.go:334] "Generic (PLEG): container finished" podID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerID="c577f6af9ab3888d7eaafd2ffe5bc4c1228acf1f9ea1fd93255848a9f2a96cbc" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.883174 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerDied","Data":"c577f6af9ab3888d7eaafd2ffe5bc4c1228acf1f9ea1fd93255848a9f2a96cbc"} Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.884885 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.888483 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.891670 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:25 crc 
kubenswrapper[5094]: I0220 06:50:25.892993 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893031 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893041 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893058 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" exitCode=2 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893091 5094 scope.go:117] "RemoveContainer" containerID="f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.659301 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.660785 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.661442 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.661939 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.662347 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: I0220 06:50:26.662391 5094 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.662948 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.864601 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Feb 20 06:50:26 crc kubenswrapper[5094]: I0220 06:50:26.936301 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:27 crc kubenswrapper[5094]: E0220 06:50:27.266336 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.291543 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.292544 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.293405 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.294069 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.296858 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.297414 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.298177 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.314857 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.314893 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.314919 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315160 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315401 5094 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315432 5094 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315442 5094 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.416796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.416949 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417127 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock" (OuterVolumeSpecName: "var-lock") pod "6d73a928-b634-44c7-a3ca-8ffc9a40277e" (UID: "6d73a928-b634-44c7-a3ca-8ffc9a40277e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417266 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6d73a928-b634-44c7-a3ca-8ffc9a40277e" (UID: "6d73a928-b634-44c7-a3ca-8ffc9a40277e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417363 5094 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.425000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6d73a928-b634-44c7-a3ca-8ffc9a40277e" (UID: "6d73a928-b634-44c7-a3ca-8ffc9a40277e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.519725 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.519769 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.848602 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.953206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerDied","Data":"86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f"} Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.953278 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.954911 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.961075 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.965007 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.966847 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" exitCode=0 Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.966920 5094 scope.go:117] "RemoveContainer" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.967007 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.967957 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.968555 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.970736 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.970948 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.990931 5094 scope.go:117] "RemoveContainer" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.010425 5094 scope.go:117] "RemoveContainer" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" Feb 20 06:50:28 crc 
kubenswrapper[5094]: I0220 06:50:28.030329 5094 scope.go:117] "RemoveContainer" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.047475 5094 scope.go:117] "RemoveContainer" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.067397 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.078032 5094 scope.go:117] "RemoveContainer" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.100601 5094 scope.go:117] "RemoveContainer" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.101287 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\": container with ID starting with 99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23 not found: ID does not exist" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101338 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23"} err="failed to get container status \"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\": rpc error: code = NotFound desc = could not find container \"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\": container with ID starting with 
99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101378 5094 scope.go:117] "RemoveContainer" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.101697 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\": container with ID starting with be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3 not found: ID does not exist" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101731 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3"} err="failed to get container status \"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\": rpc error: code = NotFound desc = could not find container \"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\": container with ID starting with be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101748 5094 scope.go:117] "RemoveContainer" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.102026 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\": container with ID starting with 64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1 not found: ID does not exist" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" Feb 20 06:50:28 crc 
kubenswrapper[5094]: I0220 06:50:28.102055 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1"} err="failed to get container status \"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\": rpc error: code = NotFound desc = could not find container \"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\": container with ID starting with 64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102079 5094 scope.go:117] "RemoveContainer" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.102361 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\": container with ID starting with 2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2 not found: ID does not exist" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102384 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2"} err="failed to get container status \"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\": rpc error: code = NotFound desc = could not find container \"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\": container with ID starting with 2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102398 5094 scope.go:117] "RemoveContainer" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" Feb 20 
06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.102806 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\": container with ID starting with e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781 not found: ID does not exist" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102828 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781"} err="failed to get container status \"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\": rpc error: code = NotFound desc = could not find container \"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\": container with ID starting with e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102842 5094 scope.go:117] "RemoveContainer" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.103120 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\": container with ID starting with c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326 not found: ID does not exist" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.103160 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326"} err="failed to get container status 
\"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\": rpc error: code = NotFound desc = could not find container \"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\": container with ID starting with c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326 not found: ID does not exist" Feb 20 06:50:29 crc kubenswrapper[5094]: E0220 06:50:29.669093 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Feb 20 06:50:29 crc kubenswrapper[5094]: E0220 06:50:29.862085 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:29 crc kubenswrapper[5094]: I0220 06:50:29.862535 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:29 crc kubenswrapper[5094]: W0220 06:50:29.886481 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511 WatchSource:0}: Error finding container 6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511: Status 404 returned error can't find the container with id 6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511 Feb 20 06:50:29 crc kubenswrapper[5094]: E0220 06:50:29.890443 5094 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895e1bce73e4261 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC m=+244.762503318,LastTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC m=+244.762503318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 06:50:29 crc kubenswrapper[5094]: I0220 06:50:29.986765 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511"} Feb 20 06:50:30 crc kubenswrapper[5094]: I0220 06:50:30.996385 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375"} Feb 20 06:50:30 crc kubenswrapper[5094]: E0220 06:50:30.997756 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:30 crc kubenswrapper[5094]: I0220 06:50:30.997972 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:32 crc kubenswrapper[5094]: E0220 06:50:32.004137 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:32 crc kubenswrapper[5094]: E0220 06:50:32.870342 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="6.4s" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 
06:50:35.840067 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.843432 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.843855 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.869623 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.869690 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:35 crc kubenswrapper[5094]: E0220 06:50:35.870436 5094 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.871443 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:35 crc kubenswrapper[5094]: W0220 06:50:35.909531 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5 WatchSource:0}: Error finding container dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5: Status 404 returned error can't find the container with id dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5 Feb 20 06:50:36 crc kubenswrapper[5094]: I0220 06:50:36.034003 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5"} Feb 20 06:50:36 crc kubenswrapper[5094]: E0220 06:50:36.397417 5094 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895e1bce73e4261 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC m=+244.762503318,LastTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC 
m=+244.762503318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.047033 5094 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="494dbad7b7d8411ed8e984463240b80543a607b517f185d55b75bf813eeccbbe" exitCode=0 Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.047316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"494dbad7b7d8411ed8e984463240b80543a607b517f185d55b75bf813eeccbbe"} Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.047929 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.049413 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.048450 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:37 crc kubenswrapper[5094]: E0220 06:50:37.050577 5094 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:38 crc kubenswrapper[5094]: I0220 06:50:38.058270 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"af8420f0b5117c09a4cdb5c47e69a98711286d7eadfe72e752164d708b1a2b0f"} Feb 20 06:50:38 crc kubenswrapper[5094]: I0220 06:50:38.058608 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c7f800776012017ad1d052ce16777d8304bc3ad26b7fbf28db344ce7a35b0301"} Feb 20 06:50:38 crc kubenswrapper[5094]: I0220 06:50:38.058618 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"904e144ecb263d525941a4e1ab624d7e8a91bfc293db2fc5f07d80e51b0cd0b8"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.066915 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"25b9b98c9f4fc705d901f9620c1677b6dd9067458524adc0ca91eb739c7ee4af"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067686 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5a04797bd174bb60faa197510196c0308fed7b84b3154b23d2486c87c8ede926"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067186 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067793 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067731 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070043 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070091 5094 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d" exitCode=1 Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070592 5094 scope.go:117] "RemoveContainer" containerID="124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.080124 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.080195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c"} Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.871740 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.871939 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.881328 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.081204 5094 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.111295 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.111345 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.116676 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:45 crc kubenswrapper[5094]: I0220 06:50:45.127897 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:45 crc kubenswrapper[5094]: I0220 06:50:45.127952 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:45 crc kubenswrapper[5094]: I0220 06:50:45.861789 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6686d6e2-d4b4-4cc4-a557-70ca70f590a8" Feb 20 06:50:46 crc kubenswrapper[5094]: I0220 06:50:46.785443 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:50:46 crc kubenswrapper[5094]: I0220 06:50:46.785821 5094 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 06:50:46 crc kubenswrapper[5094]: I0220 06:50:46.785938 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 06:50:47 crc kubenswrapper[5094]: I0220 06:50:47.960664 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:50:50 crc kubenswrapper[5094]: I0220 06:50:50.465554 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 06:50:54 crc kubenswrapper[5094]: I0220 06:50:54.419047 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 06:50:54 crc kubenswrapper[5094]: I0220 06:50:54.555911 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 06:50:55 crc kubenswrapper[5094]: I0220 06:50:55.082528 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 06:50:55 crc kubenswrapper[5094]: I0220 06:50:55.097443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.115793 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 
06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.248885 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.266748 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.302823 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.624943 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.655084 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.685415 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.785106 5094 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.785183 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 06:50:56 
crc kubenswrapper[5094]: I0220 06:50:56.901445 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.144736 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.218046 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.309931 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.506356 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.583564 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.864900 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.955514 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.967571 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.048001 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.049127 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.132559 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.192922 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.408002 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.438496 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.515360 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.567680 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.586764 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.612183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.635287 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.884760 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.885878 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.946331 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.034098 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.045088 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.048476 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.057033 5094 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.064935 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.065012 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.070529 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.096962 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.096930181 podStartE2EDuration="15.096930181s" podCreationTimestamp="2026-02-20 06:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:50:59.088630812 +0000 UTC m=+273.961257533" watchObservedRunningTime="2026-02-20 06:50:59.096930181 +0000 UTC m=+273.969556932"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.296152 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.314943 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.468809 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.640777 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.821527 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.910165 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.949919 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.950100 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.051163 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.109000 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.123021 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.137855 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.428722 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.503481 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.530481 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.627122 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.636651 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.729028 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.752492 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.851078 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.954629 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.988923 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.036977 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.167653 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.253139 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.258248 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.308434 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.309853 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.314519 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.315773 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.370509 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.376271 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.415421 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.533245 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.540886 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.589051 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.632649 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.655815 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.765257 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.823632 5094 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.915141 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.163004 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.268872 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.289773 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.292637 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.347969 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.390055 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.416467 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.436899 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.503949 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.524550 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.581685 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.616192 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.631128 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.645562 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.647202 5094 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.697460 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.710664 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.716022 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.818920 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.944975 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.011241 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.020006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.089734 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.124747 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.315022 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.384020 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.418874 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.491243 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.522107 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.662343 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.758862 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.767774 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.770348 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.826495 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.848698 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.871270 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.906869 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.954293 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.954728 5094 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.076459 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.096674 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.102752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.233215 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.236299 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.297044 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.328150 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.393674 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.458327 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.488788 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.488891 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.777381 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.784231 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.785330 5094 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.816134 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.958384 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.062325 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.064542 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.115683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.196469 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.240683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.247627 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.392986 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.417137 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.439455 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.454620 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.476599 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.484145 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.678838 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.760606 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.878844 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.903073 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.917464 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.927588 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.021023 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.055023 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.108272 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.157350 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.311141 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.342646 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.349884 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.356691 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.451244 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.452437 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.455682 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.466592 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.501054 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.555911 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.570958 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.618794 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.677834 5094 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.678106 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" gracePeriod=5
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.738038 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.744903 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.785374 5094 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.785499 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.785609 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.787203 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.787436 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c" gracePeriod=30
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.801899 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.836347 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.960603 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.968265 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.020786 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.096485 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.133827 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.276149 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.357584 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.449672 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.452372 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.606564 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.609088 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.610915 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.611349 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.627455 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.756686 5094 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.803360 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.817604 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.904776 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.937494 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.954897 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.016666 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.111738 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb
20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.122522 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.171486 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.361615 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.485565 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.502052 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.539984 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.542029 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.618900 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.685935 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.709796 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.768874 5094 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.824647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.837363 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.843656 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.849856 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.908011 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.912686 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.918375 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.045315 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.058879 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.061692 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 
06:51:09.086251 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.215300 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.296588 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.317801 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.361026 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.534192 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.616689 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.746772 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.785467 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.792452 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.852665 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 20 06:51:09 crc 
kubenswrapper[5094]: I0220 06:51:09.883965 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.100693 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.143518 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.171183 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.262476 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.315242 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.414592 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.483012 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.601470 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.646986 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.734194 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.067035 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.344641 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.353646 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.515693 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.592573 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.665002 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.722441 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.804886 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.132646 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.284403 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.284573 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346002 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346092 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346140 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346414 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346620 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346832 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347294 5094 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347329 5094 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347355 5094 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347379 5094 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.360250 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361094 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361170 5094 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" exitCode=137 Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361243 5094 scope.go:117] "RemoveContainer" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361442 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.382397 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.428198 5094 scope.go:117] "RemoveContainer" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" Feb 20 06:51:12 crc kubenswrapper[5094]: E0220 06:51:12.428925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375\": container with ID starting with f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375 not found: ID does not exist" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.429008 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375"} err="failed to get container status \"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375\": rpc error: code = NotFound desc = could not find container \"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375\": container with ID starting with f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375 not found: ID does not exist" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.449423 5094 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.800443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.821385 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 06:51:13 crc kubenswrapper[5094]: I0220 06:51:13.853221 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 20 06:51:14 crc kubenswrapper[5094]: I0220 06:51:14.332131 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 06:51:14 crc kubenswrapper[5094]: I0220 06:51:14.672259 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 06:51:25 crc kubenswrapper[5094]: I0220 06:51:25.609567 5094 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.549873 5094 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554426 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554806 5094 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c" exitCode=137 Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c"} Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554916 5094 scope.go:117] "RemoveContainer" containerID="124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d" Feb 20 06:51:38 crc kubenswrapper[5094]: I0220 06:51:38.567069 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 06:51:38 crc kubenswrapper[5094]: I0220 06:51:38.570911 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6f9d419f2b468dd916110a489ee71eeb5463e527d671db8c7f43354327174777"} Feb 20 06:51:46 crc kubenswrapper[5094]: I0220 06:51:46.785152 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:46 crc 
kubenswrapper[5094]: I0220 06:51:46.790773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:47 crc kubenswrapper[5094]: I0220 06:51:47.631167 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:47 crc kubenswrapper[5094]: I0220 06:51:47.637488 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.755187 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.757023 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" containerID="cri-o://bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" gracePeriod=30 Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763456 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763529 5094 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qrtpl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763561 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" 
podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763688 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" containerID="cri-o://0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" gracePeriod=30 Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783293 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8zvf9"] Feb 20 06:51:57 crc kubenswrapper[5094]: E0220 06:51:57.783534 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783548 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 06:51:57 crc kubenswrapper[5094]: E0220 06:51:57.783567 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerName="installer" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783574 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerName="installer" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783674 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783690 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerName="installer" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 
06:51:57.784140 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.803557 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8zvf9"] Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933012 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f210823-4d80-4b35-aaef-bb100cf601dd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f210823-4d80-4b35-aaef-bb100cf601dd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933127 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-tls\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: 
\"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933179 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-certificates\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933199 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-bound-sa-token\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933218 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlpm\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-kube-api-access-lhlpm\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933247 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-trusted-ca\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.973024 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034473 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-tls\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034535 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-certificates\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-bound-sa-token\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034576 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlpm\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-kube-api-access-lhlpm\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc 
kubenswrapper[5094]: I0220 06:51:58.034606 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-trusted-ca\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034639 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f210823-4d80-4b35-aaef-bb100cf601dd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f210823-4d80-4b35-aaef-bb100cf601dd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.039226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-trusted-ca\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.039593 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f210823-4d80-4b35-aaef-bb100cf601dd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.040124 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-certificates\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.048677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f210823-4d80-4b35-aaef-bb100cf601dd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.051693 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-tls\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.059995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-bound-sa-token\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.066429 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlpm\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-kube-api-access-lhlpm\") pod 
\"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.100735 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.185126 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.245982 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.276951 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277566 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277623 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277656 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277812 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277858 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277922 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277941 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 
06:51:58.278887 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config" (OuterVolumeSpecName: "config") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.279273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.279400 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config" (OuterVolumeSpecName: "config") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.279740 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca" (OuterVolumeSpecName: "client-ca") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.280013 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca" (OuterVolumeSpecName: "client-ca") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.284122 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr" (OuterVolumeSpecName: "kube-api-access-fmphr") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "kube-api-access-fmphr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.285396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.285507 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn" (OuterVolumeSpecName: "kube-api-access-6vnhn") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "kube-api-access-6vnhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.285608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.372989 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8zvf9"] Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.379957 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380009 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380023 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380045 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380089 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") on node \"crc\" DevicePath \"\"" 
Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380158 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380172 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380184 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380217 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: W0220 06:51:58.382976 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f210823_4d80_4b35_aaef_bb100cf601dd.slice/crio-7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94 WatchSource:0}: Error finding container 7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94: Status 404 returned error can't find the container with id 7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94 Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.703929 5094 generic.go:334] "Generic (PLEG): container finished" podID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" exitCode=0 Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704047 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerDied","Data":"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704046 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704113 5094 scope.go:117] "RemoveContainer" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerDied","Data":"94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.707504 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" event={"ID":"4f210823-4d80-4b35-aaef-bb100cf601dd","Type":"ContainerStarted","Data":"eb6f5628caae9cda47a27fe82dedee5a5710337704919079e0bcc13a086126fc"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.707551 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" event={"ID":"4f210823-4d80-4b35-aaef-bb100cf601dd","Type":"ContainerStarted","Data":"7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.707661 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711507 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" exitCode=0 Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerDied","Data":"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711609 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerDied","Data":"a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711678 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.731979 5094 scope.go:117] "RemoveContainer" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" Feb 20 06:51:58 crc kubenswrapper[5094]: E0220 06:51:58.734096 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c\": container with ID starting with bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c not found: ID does not exist" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.734170 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c"} err="failed to get container status 
\"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c\": rpc error: code = NotFound desc = could not find container \"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c\": container with ID starting with bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c not found: ID does not exist" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.734229 5094 scope.go:117] "RemoveContainer" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.735326 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" podStartSLOduration=1.7353025899999999 podStartE2EDuration="1.73530259s" podCreationTimestamp="2026-02-20 06:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:51:58.73237331 +0000 UTC m=+333.605000021" watchObservedRunningTime="2026-02-20 06:51:58.73530259 +0000 UTC m=+333.607929311" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.749117 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.758137 5094 scope.go:117] "RemoveContainer" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.760044 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:51:58 crc kubenswrapper[5094]: E0220 06:51:58.761857 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02\": container with ID starting with 
0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02 not found: ID does not exist" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.761904 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02"} err="failed to get container status \"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02\": rpc error: code = NotFound desc = could not find container \"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02\": container with ID starting with 0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02 not found: ID does not exist" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.769353 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.774525 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.376539 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.377817 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.377869 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.377930 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: 
I0220 06:51:59.377951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.378250 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.378303 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.379332 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.380667 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.381790 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.382739 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.382881 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.382895 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.385655 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393460 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mvmt\" (UniqueName: 
\"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393534 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393569 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393633 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393667 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393715 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393744 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393546 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.394673 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.394970 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.394975 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.395014 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.395175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.395590 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.404855 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.407106 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.414213 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.490768 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.491447 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-6mvmt proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" podUID="6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497268 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " 
pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497411 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497446 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497585 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod 
\"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497610 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497671 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.498003 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.498564 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-2mxsb serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" podUID="490c709b-3530-46b0-9418-fb8e74f8ea3d" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.499913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.500139 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.500477 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.500677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.504547 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") 
pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.505690 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.513485 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.525788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.536841 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.722393 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.722455 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.733682 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.747977 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801767 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801836 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801937 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801987 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.802022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.803382 5094 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config" (OuterVolumeSpecName: "config") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.803935 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca" (OuterVolumeSpecName: "client-ca") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.804246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config" (OuterVolumeSpecName: "config") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.805082 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca" (OuterVolumeSpecName: "client-ca") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.805273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.805780 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.807272 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb" (OuterVolumeSpecName: "kube-api-access-2mxsb") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "kube-api-access-2mxsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.807860 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt" (OuterVolumeSpecName: "kube-api-access-6mvmt") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "kube-api-access-6mvmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.809115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.850594 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" path="/var/lib/kubelet/pods/18cc290d-78be-42c6-af5b-3b8b86941eb2/volumes" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.851530 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" path="/var/lib/kubelet/pods/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5/volumes" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902784 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902816 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902826 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902836 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902848 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902863 5094 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902872 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902881 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902889 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.932004 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490c709b_3530_46b0_9418_fb8e74f8ea3d.slice\": RecentStats: unable to find data in memory cache]" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.728343 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.728370 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.760974 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.765198 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.769895 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.770860 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.780674 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.781175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.781175 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.787016 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.787820 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.787880 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.792797 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.805406 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.810366 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815724 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-config\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-client-ca\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5225\" (UniqueName: \"kubernetes.io/projected/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-kube-api-access-q5225\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " 
pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-serving-cert\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917783 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-config\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-client-ca\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917920 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5225\" (UniqueName: \"kubernetes.io/projected/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-kube-api-access-q5225\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917957 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-serving-cert\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.918971 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-client-ca\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.919307 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-config\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.922107 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-serving-cert\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.933120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5225\" (UniqueName: \"kubernetes.io/projected/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-kube-api-access-q5225\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" 
Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.094361 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.389848 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8"] Feb 20 06:52:01 crc kubenswrapper[5094]: W0220 06:52:01.397262 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a79b08a_58f4_4ce4_a969_8cd6dd46bcb3.slice/crio-ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8 WatchSource:0}: Error finding container ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8: Status 404 returned error can't find the container with id ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8 Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.734946 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" event={"ID":"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3","Type":"ContainerStarted","Data":"50fc91aeb169884b1c2057fd672faa52f419ffac32eb752bd24ff76767186fe3"} Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.735004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" event={"ID":"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3","Type":"ContainerStarted","Data":"ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8"} Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.735306 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.755807 5094 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" podStartSLOduration=2.75578467 podStartE2EDuration="2.75578467s" podCreationTimestamp="2026-02-20 06:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:52:01.75286387 +0000 UTC m=+336.625490591" watchObservedRunningTime="2026-02-20 06:52:01.75578467 +0000 UTC m=+336.628411401" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.847802 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490c709b-3530-46b0-9418-fb8e74f8ea3d" path="/var/lib/kubelet/pods/490c709b-3530-46b0-9418-fb8e74f8ea3d/volumes" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.848518 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" path="/var/lib/kubelet/pods/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2/volumes" Feb 20 06:52:02 crc kubenswrapper[5094]: I0220 06:52:02.035891 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.394407 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj"] Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.395745 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.398311 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.398544 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.400656 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj"] Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.401672 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.401849 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.402319 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.402583 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.409862 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555616 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpttx\" (UniqueName: \"kubernetes.io/projected/c9baff55-48ba-47a8-9f75-ed2e819db14b-kube-api-access-xpttx\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " 
pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9baff55-48ba-47a8-9f75-ed2e819db14b-serving-cert\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-config\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555758 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-proxy-ca-bundles\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-client-ca\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657555 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpttx\" (UniqueName: 
\"kubernetes.io/projected/c9baff55-48ba-47a8-9f75-ed2e819db14b-kube-api-access-xpttx\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657619 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9baff55-48ba-47a8-9f75-ed2e819db14b-serving-cert\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657720 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-config\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-proxy-ca-bundles\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657781 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-client-ca\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.658983 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-client-ca\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.659856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-config\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.660013 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-proxy-ca-bundles\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.671506 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9baff55-48ba-47a8-9f75-ed2e819db14b-serving-cert\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.684182 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpttx\" (UniqueName: \"kubernetes.io/projected/c9baff55-48ba-47a8-9f75-ed2e819db14b-kube-api-access-xpttx\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 
06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.734776 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.005293 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj"] Feb 20 06:52:04 crc kubenswrapper[5094]: W0220 06:52:04.023126 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9baff55_48ba_47a8_9f75_ed2e819db14b.slice/crio-169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad WatchSource:0}: Error finding container 169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad: Status 404 returned error can't find the container with id 169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.108026 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.108162 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.765872 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" 
event={"ID":"c9baff55-48ba-47a8-9f75-ed2e819db14b","Type":"ContainerStarted","Data":"71ce46af593781e8d088039b07ecf8ad760da89d320957ceeb474c4e06c04dae"} Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.766309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" event={"ID":"c9baff55-48ba-47a8-9f75-ed2e819db14b","Type":"ContainerStarted","Data":"169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad"} Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.766412 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.771905 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.794082 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" podStartSLOduration=5.794061005 podStartE2EDuration="5.794061005s" podCreationTimestamp="2026-02-20 06:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:52:04.789318812 +0000 UTC m=+339.661945523" watchObservedRunningTime="2026-02-20 06:52:04.794061005 +0000 UTC m=+339.666687716" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.635457 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.636766 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zj2v" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" 
containerID="cri-o://beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.648889 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.649158 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c94fj" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" containerID="cri-o://0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.669281 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.669572 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" containerID="cri-o://ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.676424 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.676743 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jlc84" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" containerID="cri-o://289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.692554 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.692876 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tpwcx" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" containerID="cri-o://42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.698117 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8j9k"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.700555 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.722186 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8j9k"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.807211 5094 generic.go:334] "Generic (PLEG): container finished" podID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerID="beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.807281 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.809585 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerID="ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.809649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" 
event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerDied","Data":"ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.811865 5094 generic.go:334] "Generic (PLEG): container finished" podID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerID="289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.811925 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.825690 5094 generic.go:334] "Generic (PLEG): container finished" podID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerID="0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.825789 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.884501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.884603 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.884627 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqs2q\" (UniqueName: \"kubernetes.io/projected/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-kube-api-access-wqs2q\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.985586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.987387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.988339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqs2q\" (UniqueName: \"kubernetes.io/projected/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-kube-api-access-wqs2q\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 
06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.988563 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.998775 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.015864 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqs2q\" (UniqueName: \"kubernetes.io/projected/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-kube-api-access-wqs2q\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.016688 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.175781 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.192363 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.192429 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.192532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.193582 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities" (OuterVolumeSpecName: "utilities") pod "88e94523-c126-4ce8-a6c7-2f83eb91d3fc" (UID: "88e94523-c126-4ce8-a6c7-2f83eb91d3fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.207330 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5" (OuterVolumeSpecName: "kube-api-access-2f7t5") pod "88e94523-c126-4ce8-a6c7-2f83eb91d3fc" (UID: "88e94523-c126-4ce8-a6c7-2f83eb91d3fc"). InnerVolumeSpecName "kube-api-access-2f7t5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.270423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88e94523-c126-4ce8-a6c7-2f83eb91d3fc" (UID: "88e94523-c126-4ce8-a6c7-2f83eb91d3fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299216 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299253 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299264 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299519 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.350130 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.353109 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.362551 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400437 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400518 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"8ecc3e73-dd76-4a73-a366-92c78aca386e\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400574 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"8ecc3e73-dd76-4a73-a366-92c78aca386e\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod 
\"66c35d10-d5cc-468f-95a1-b56fde3961b3\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400734 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"66c35d10-d5cc-468f-95a1-b56fde3961b3\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400770 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"8ecc3e73-dd76-4a73-a366-92c78aca386e\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400899 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400923 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400958 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod \"66c35d10-d5cc-468f-95a1-b56fde3961b3\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.401712 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" (UID: "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.402687 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities" (OuterVolumeSpecName: "utilities") pod "8ecc3e73-dd76-4a73-a366-92c78aca386e" (UID: "8ecc3e73-dd76-4a73-a366-92c78aca386e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.405984 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" (UID: "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.410021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb" (OuterVolumeSpecName: "kube-api-access-dw4pb") pod "f5c1eecf-1cc2-4480-ac22-99a970f5dc58" (UID: "f5c1eecf-1cc2-4480-ac22-99a970f5dc58"). InnerVolumeSpecName "kube-api-access-dw4pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.411084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities" (OuterVolumeSpecName: "utilities") pod "f5c1eecf-1cc2-4480-ac22-99a970f5dc58" (UID: "f5c1eecf-1cc2-4480-ac22-99a970f5dc58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.413650 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp" (OuterVolumeSpecName: "kube-api-access-bxwcp") pod "66c35d10-d5cc-468f-95a1-b56fde3961b3" (UID: "66c35d10-d5cc-468f-95a1-b56fde3961b3"). InnerVolumeSpecName "kube-api-access-bxwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.415531 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms" (OuterVolumeSpecName: "kube-api-access-s5cms") pod "8ecc3e73-dd76-4a73-a366-92c78aca386e" (UID: "8ecc3e73-dd76-4a73-a366-92c78aca386e"). InnerVolumeSpecName "kube-api-access-s5cms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.417021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities" (OuterVolumeSpecName: "utilities") pod "66c35d10-d5cc-468f-95a1-b56fde3961b3" (UID: "66c35d10-d5cc-468f-95a1-b56fde3961b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.423303 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg" (OuterVolumeSpecName: "kube-api-access-kj5hg") pod "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" (UID: "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60"). InnerVolumeSpecName "kube-api-access-kj5hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.442633 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5c1eecf-1cc2-4480-ac22-99a970f5dc58" (UID: "f5c1eecf-1cc2-4480-ac22-99a970f5dc58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.473566 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66c35d10-d5cc-468f-95a1-b56fde3961b3" (UID: "66c35d10-d5cc-468f-95a1-b56fde3961b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503262 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503306 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503317 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503328 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503339 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503348 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503358 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc 
kubenswrapper[5094]: I0220 06:52:10.503368 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503377 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503387 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503396 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.557302 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ecc3e73-dd76-4a73-a366-92c78aca386e" (UID: "8ecc3e73-dd76-4a73-a366-92c78aca386e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.598829 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8j9k"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.604242 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.837830 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" exitCode=0 Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.837916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.837931 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.839605 5094 scope.go:117] "RemoveContainer" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.839611 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.851735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.851852 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.856503 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.856502 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"a1dea229adcbee55ddf6e0b41aedbdc8cf35c14a6f68369e5313338e917770ed"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.861578 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.861623 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.866941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" event={"ID":"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9","Type":"ContainerStarted","Data":"e34846bdab6e9a22c8340dffa1a148d16bdd358d036afce89d81c4ada172b476"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.866976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" event={"ID":"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9","Type":"ContainerStarted","Data":"ae843642a9b18bca47bf28346b209a715fd9f8e7e68ae0ddfbc1b9266a543482"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.867960 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.870086 5094 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j8j9k container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" start-of-body= Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.870131 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" podUID="a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.872904 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerDied","Data":"c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.873053 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.878225 5094 scope.go:117] "RemoveContainer" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.889415 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.893366 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.911561 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" podStartSLOduration=1.911532254 podStartE2EDuration="1.911532254s" podCreationTimestamp="2026-02-20 06:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:52:10.903729218 +0000 UTC m=+345.776355949" watchObservedRunningTime="2026-02-20 06:52:10.911532254 +0000 UTC m=+345.784158985" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.928749 5094 scope.go:117] "RemoveContainer" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.943482 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.961865 5094 scope.go:117] "RemoveContainer" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" Feb 20 06:52:10 crc kubenswrapper[5094]: E0220 06:52:10.962885 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6\": container with ID starting with 42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6 not found: ID does not exist" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.962925 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6"} err="failed to get container status \"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6\": rpc error: code = NotFound desc = could not find container \"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6\": container with ID starting with 42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6 not found: ID does not exist" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.962951 5094 scope.go:117] "RemoveContainer" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" Feb 20 06:52:10 
crc kubenswrapper[5094]: E0220 06:52:10.964156 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799\": container with ID starting with 94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799 not found: ID does not exist" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964180 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799"} err="failed to get container status \"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799\": rpc error: code = NotFound desc = could not find container \"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799\": container with ID starting with 94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799 not found: ID does not exist" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964196 5094 scope.go:117] "RemoveContainer" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964365 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:52:10 crc kubenswrapper[5094]: E0220 06:52:10.964615 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222\": container with ID starting with 30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222 not found: ID does not exist" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964694 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222"} err="failed to get container status \"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222\": rpc error: code = NotFound desc = could not find container \"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222\": container with ID starting with 30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222 not found: ID does not exist" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964784 5094 scope.go:117] "RemoveContainer" containerID="289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.976050 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.981502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.997913 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.999521 5094 scope.go:117] "RemoveContainer" containerID="ca9993542532855c09ba40fb79d1b2ff1916ab7e330faa07724168697397276c" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.005542 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.011031 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.016895 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.022840 5094 scope.go:117] "RemoveContainer" 
containerID="064b2c1894993da3d72d5837edcc05f078452e781464dc1e2a5cab9234eb9f15" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.043073 5094 scope.go:117] "RemoveContainer" containerID="0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.063399 5094 scope.go:117] "RemoveContainer" containerID="a3085de07d8490c70b05f62d546c4844150be55db1f8b370f140f0fcadcb36da" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.080268 5094 scope.go:117] "RemoveContainer" containerID="6eb6b4fa4af198a75121d1f1d8845384553bbedcf18e198cdf00cc8282d9f5b7" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.097744 5094 scope.go:117] "RemoveContainer" containerID="beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.112821 5094 scope.go:117] "RemoveContainer" containerID="61352ad384c7169a9a29e90c914460822eb2dc45803cccf5cac7d1c7d42a40b1" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.130527 5094 scope.go:117] "RemoveContainer" containerID="11182af2209155c05ce50ce6f5457662dfc62f8d75b0ebcee01f179c458884f9" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.156384 5094 scope.go:117] "RemoveContainer" containerID="ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.865201 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" path="/var/lib/kubelet/pods/66c35d10-d5cc-468f-95a1-b56fde3961b3/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.867820 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" path="/var/lib/kubelet/pods/88e94523-c126-4ce8-a6c7-2f83eb91d3fc/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.869587 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" path="/var/lib/kubelet/pods/8ecc3e73-dd76-4a73-a366-92c78aca386e/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.872338 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" path="/var/lib/kubelet/pods/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.873519 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" path="/var/lib/kubelet/pods/f5c1eecf-1cc2-4480-ac22-99a970f5dc58/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.874789 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zw5zl"] Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875140 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875183 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875225 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875243 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875270 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875289 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" Feb 20 
06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875313 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875331 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875348 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875365 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875380 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875393 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875414 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875427 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875445 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875458 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-content" Feb 20 
06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875477 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875493 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875508 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875520 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875537 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875554 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875571 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875584 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875607 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875619 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-utilities" Feb 20 
06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876856 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876912 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876940 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876968 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876989 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.878652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zw5zl"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.878892 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.883066 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.900124 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.057830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-utilities\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.057922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-catalog-content\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.058014 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56hm5\" (UniqueName: \"kubernetes.io/projected/4ba2a013-0ac4-4983-92e6-875272450307-kube-api-access-56hm5\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.060641 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.062594 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.065816 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.072903 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160436 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160528 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160571 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-utilities\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-catalog-content\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") 
" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160752 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56hm5\" (UniqueName: \"kubernetes.io/projected/4ba2a013-0ac4-4983-92e6-875272450307-kube-api-access-56hm5\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.161070 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-utilities\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.161415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-catalog-content\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.201272 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56hm5\" (UniqueName: \"kubernetes.io/projected/4ba2a013-0ac4-4983-92e6-875272450307-kube-api-access-56hm5\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " 
pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.201808 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262279 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262386 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262953 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.263130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.285980 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.381897 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.655474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zw5zl"] Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.831835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 06:52:12 crc kubenswrapper[5094]: W0220 06:52:12.840475 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod061991e0_0b0a_4e47_9275_e00b323e9fb2.slice/crio-e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f WatchSource:0}: Error finding container e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f: Status 404 returned error can't find the container with id e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.902846 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ba2a013-0ac4-4983-92e6-875272450307" containerID="7dc011fed8be0e19eb2f7338ebba3000b50cb6bde5998e42c6796930786d7ccc" exitCode=0 Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 
06:52:12.902954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerDied","Data":"7dc011fed8be0e19eb2f7338ebba3000b50cb6bde5998e42c6796930786d7ccc"} Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.903023 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerStarted","Data":"1c5ebaedc38619e3f93441f65705f34966ac3b27c07b83db392d20fb8c0b8942"} Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.904756 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerStarted","Data":"e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f"} Feb 20 06:52:13 crc kubenswrapper[5094]: I0220 06:52:13.914096 5094 generic.go:334] "Generic (PLEG): container finished" podID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" exitCode=0 Feb 20 06:52:13 crc kubenswrapper[5094]: I0220 06:52:13.914195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a"} Feb 20 06:52:13 crc kubenswrapper[5094]: I0220 06:52:13.918269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerStarted","Data":"aacf0b6e60280be26e57e0c8961f993dc2cc741791b6fdeaf12dfe5d0c9f21d7"} Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.256084 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 
06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.257382 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.262064 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.266796 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.318683 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.318832 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.318912 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.421104 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhpt\" (UniqueName: 
\"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.421210 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.421331 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.422022 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.422061 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.456856 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qcxs4"] Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 
06:52:14.458762 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.460677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.461533 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.472057 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcxs4"] Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.524276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-utilities\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.524572 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-catalog-content\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.524983 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48fn\" (UniqueName: 
\"kubernetes.io/projected/447e3a00-67d2-44c4-89cd-def383a3693d-kube-api-access-m48fn\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.588568 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.627829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-utilities\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.627904 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-catalog-content\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.627931 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48fn\" (UniqueName: \"kubernetes.io/projected/447e3a00-67d2-44c4-89cd-def383a3693d-kube-api-access-m48fn\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.628957 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-utilities\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: 
I0220 06:52:14.629275 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-catalog-content\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.647550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48fn\" (UniqueName: \"kubernetes.io/projected/447e3a00-67d2-44c4-89cd-def383a3693d-kube-api-access-m48fn\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.802122 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.930563 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ba2a013-0ac4-4983-92e6-875272450307" containerID="aacf0b6e60280be26e57e0c8961f993dc2cc741791b6fdeaf12dfe5d0c9f21d7" exitCode=0 Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.930641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerDied","Data":"aacf0b6e60280be26e57e0c8961f993dc2cc741791b6fdeaf12dfe5d0c9f21d7"} Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.935356 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerStarted","Data":"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.032639 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] 
Feb 20 06:52:15 crc kubenswrapper[5094]: W0220 06:52:15.036262 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6db9ece_1aa7_4ea4_b800_b710a760edf6.slice/crio-b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f WatchSource:0}: Error finding container b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f: Status 404 returned error can't find the container with id b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.208026 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcxs4"] Feb 20 06:52:15 crc kubenswrapper[5094]: W0220 06:52:15.216487 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod447e3a00_67d2_44c4_89cd_def383a3693d.slice/crio-85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92 WatchSource:0}: Error finding container 85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92: Status 404 returned error can't find the container with id 85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.946232 5094 generic.go:334] "Generic (PLEG): container finished" podID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" exitCode=0 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.946322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.946750 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" 
event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerStarted","Data":"b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.950004 5094 generic.go:334] "Generic (PLEG): container finished" podID="447e3a00-67d2-44c4-89cd-def383a3693d" containerID="f62030d23eeaced6be390007f7315e448784f435f42fdbb04b42cabfa6e3035a" exitCode=0 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.950159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerDied","Data":"f62030d23eeaced6be390007f7315e448784f435f42fdbb04b42cabfa6e3035a"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.950191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerStarted","Data":"85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.955116 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerStarted","Data":"8b624fcc4bd17032c51f91b024fb53e5316783a2b5cdf796c45c17851b976620"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.957851 5094 generic.go:334] "Generic (PLEG): container finished" podID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" exitCode=0 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.957892 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4"} Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 
06:52:16.047561 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zw5zl" podStartSLOduration=2.582064351 podStartE2EDuration="5.047524658s" podCreationTimestamp="2026-02-20 06:52:11 +0000 UTC" firstStartedPulling="2026-02-20 06:52:12.905004322 +0000 UTC m=+347.777631033" lastFinishedPulling="2026-02-20 06:52:15.370464639 +0000 UTC m=+350.243091340" observedRunningTime="2026-02-20 06:52:16.045870809 +0000 UTC m=+350.918497520" watchObservedRunningTime="2026-02-20 06:52:16.047524658 +0000 UTC m=+350.920151379" Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.967003 5094 generic.go:334] "Generic (PLEG): container finished" podID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" exitCode=0 Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.967114 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc"} Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.975010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerStarted","Data":"4fae2eaa3d4ffc51aab4f89c4f82ef33cbcb1a20909f5dc51ac0898e5482ee62"} Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.977768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerStarted","Data":"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd"} Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.010143 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nhpxw" 
podStartSLOduration=2.610081958 podStartE2EDuration="5.010113381s" podCreationTimestamp="2026-02-20 06:52:12 +0000 UTC" firstStartedPulling="2026-02-20 06:52:13.915907879 +0000 UTC m=+348.788534600" lastFinishedPulling="2026-02-20 06:52:16.315939312 +0000 UTC m=+351.188566023" observedRunningTime="2026-02-20 06:52:17.008516443 +0000 UTC m=+351.881143154" watchObservedRunningTime="2026-02-20 06:52:17.010113381 +0000 UTC m=+351.882740102" Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.986606 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerStarted","Data":"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb"} Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.989965 5094 generic.go:334] "Generic (PLEG): container finished" podID="447e3a00-67d2-44c4-89cd-def383a3693d" containerID="4fae2eaa3d4ffc51aab4f89c4f82ef33cbcb1a20909f5dc51ac0898e5482ee62" exitCode=0 Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.990098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerDied","Data":"4fae2eaa3d4ffc51aab4f89c4f82ef33cbcb1a20909f5dc51ac0898e5482ee62"} Feb 20 06:52:18 crc kubenswrapper[5094]: I0220 06:52:18.015098 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tzpc7" podStartSLOduration=2.6231710550000003 podStartE2EDuration="4.015077687s" podCreationTimestamp="2026-02-20 06:52:14 +0000 UTC" firstStartedPulling="2026-02-20 06:52:15.949778813 +0000 UTC m=+350.822405544" lastFinishedPulling="2026-02-20 06:52:17.341685465 +0000 UTC m=+352.214312176" observedRunningTime="2026-02-20 06:52:18.013059258 +0000 UTC m=+352.885686009" watchObservedRunningTime="2026-02-20 06:52:18.015077687 +0000 UTC m=+352.887704398" Feb 20 06:52:18 crc 
kubenswrapper[5094]: I0220 06:52:18.106236 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:52:18 crc kubenswrapper[5094]: I0220 06:52:18.168962 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:52:18 crc kubenswrapper[5094]: I0220 06:52:18.997415 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerStarted","Data":"5d1a70b01add6fd3acd49cff0d943afdd7848ce36a44eeba122d28a271fa3ccd"} Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.202788 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.202865 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.265533 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.290968 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qcxs4" podStartSLOduration=5.857528876 podStartE2EDuration="8.290946927s" podCreationTimestamp="2026-02-20 06:52:14 +0000 UTC" firstStartedPulling="2026-02-20 06:52:15.955894139 +0000 UTC m=+350.828520870" lastFinishedPulling="2026-02-20 06:52:18.38931221 +0000 UTC m=+353.261938921" observedRunningTime="2026-02-20 06:52:19.031933887 +0000 UTC m=+353.904560608" watchObservedRunningTime="2026-02-20 06:52:22.290946927 +0000 UTC m=+357.163573628" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.382458 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.382882 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.441642 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:23 crc kubenswrapper[5094]: I0220 06:52:23.081837 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:23 crc kubenswrapper[5094]: I0220 06:52:23.096055 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.588979 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.589567 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.638081 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.802831 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.803830 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:25 crc kubenswrapper[5094]: I0220 06:52:25.096519 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:25 crc 
kubenswrapper[5094]: I0220 06:52:25.846854 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qcxs4" podUID="447e3a00-67d2-44c4-89cd-def383a3693d" containerName="registry-server" probeResult="failure" output=< Feb 20 06:52:25 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:52:25 crc kubenswrapper[5094]: > Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.106584 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.107900 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.843215 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.885004 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.218058 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry" containerID="cri-o://aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd" gracePeriod=30 Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.710360 5094 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778473 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778527 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778692 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778853 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") "
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778886 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") "
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") "
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.779834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.781685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.789400 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6" (OuterVolumeSpecName: "kube-api-access-76rz6") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "kube-api-access-76rz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.794045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.794351 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.794905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.796030 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.807018 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.880688 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.880962 5094 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881134 5094 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881290 5094 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881436 5094 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881656 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") on node \"crc\" DevicePath \"\""
Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881843 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191459 5094 generic.go:334] "Generic (PLEG): container finished" podID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd" exitCode=0
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191522 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191542 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerDied","Data":"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"}
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191604 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerDied","Data":"cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a"}
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191628 5094 scope.go:117] "RemoveContainer" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.230463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"]
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.233731 5094 scope.go:117] "RemoveContainer" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"
Feb 20 06:52:44 crc kubenswrapper[5094]: E0220 06:52:44.234390 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd\": container with ID starting with aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd not found: ID does not exist" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.234432 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"} err="failed to get container status \"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd\": rpc error: code = NotFound desc = could not find container \"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd\": container with ID starting with aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd not found: ID does not exist"
Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.239961 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"]
Feb 20 06:52:45 crc kubenswrapper[5094]: I0220 06:52:45.859862 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" path="/var/lib/kubelet/pods/fa6b00ff-07fb-4e9a-80da-780c22acbe69/volumes"
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.107121 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.108034 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.108138 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.109380 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.109518 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645" gracePeriod=600
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.373007 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645" exitCode=0
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.373464 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645"}
Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.373526 5094 scope.go:117] "RemoveContainer" containerID="85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f"
Feb 20 06:53:05 crc kubenswrapper[5094]: I0220 06:53:05.385193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852"}
Feb 20 06:55:04 crc kubenswrapper[5094]: I0220 06:55:04.107807 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 06:55:04 crc kubenswrapper[5094]: I0220 06:55:04.108642 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 06:55:34 crc kubenswrapper[5094]: I0220 06:55:34.107353 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 06:55:34 crc kubenswrapper[5094]: I0220 06:55:34.108554 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.106414 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.107322 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.107370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.107962 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.108048 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852" gracePeriod=600
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.907770 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852" exitCode=0
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.907856 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852"}
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.908199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3"}
Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.908230 5094 scope.go:117] "RemoveContainer" containerID="c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645"
Feb 20 06:58:04 crc kubenswrapper[5094]: I0220 06:58:04.107560 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 06:58:04 crc kubenswrapper[5094]: I0220 06:58:04.109323 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 06:58:34 crc kubenswrapper[5094]: I0220 06:58:34.107155 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 06:58:34 crc kubenswrapper[5094]: I0220 06:58:34.108080 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 06:59:01 crc kubenswrapper[5094]: I0220 06:59:01.828622 5094 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.107154 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.107251 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.107308 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.108036 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.108107 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3" gracePeriod=600
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.283904 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3" exitCode=0
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.283989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3"}
Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.284467 5094 scope.go:117] "RemoveContainer" containerID="f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852"
Feb 20 06:59:05 crc kubenswrapper[5094]: I0220 06:59:05.295271 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6"}
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.376025 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"]
Feb 20 06:59:19 crc kubenswrapper[5094]: E0220 06:59:19.377372 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.377396 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.377613 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.379139 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.402227 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"]
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.543831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.543952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.544027 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.645984 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.646078 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.646125 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.646946 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.647055 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.677366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.760823 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.040225 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"]
Feb 20 06:59:20 crc kubenswrapper[5094]: W0220 06:59:20.054966 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821ee7ec_9cd3_4402_b70d_1c06d52aeb22.slice/crio-5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb WatchSource:0}: Error finding container 5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb: Status 404 returned error can't find the container with id 5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb
Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.458033 5094 generic.go:334] "Generic (PLEG): container finished" podID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" exitCode=0
Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.458776 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2"}
Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.458836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerStarted","Data":"5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb"}
Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.462109 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 06:59:21 crc kubenswrapper[5094]: I0220 06:59:21.469373 5094 generic.go:334] "Generic (PLEG): container finished" podID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" exitCode=0
Feb 20 06:59:21 crc kubenswrapper[5094]: I0220 06:59:21.469501 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1"}
Feb 20 06:59:22 crc kubenswrapper[5094]: I0220 06:59:22.479761 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerStarted","Data":"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67"}
Feb 20 06:59:22 crc kubenswrapper[5094]: I0220 06:59:22.511481 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4z8d2" podStartSLOduration=2.09378517 podStartE2EDuration="3.511454592s" podCreationTimestamp="2026-02-20 06:59:19 +0000 UTC" firstStartedPulling="2026-02-20 06:59:20.461725778 +0000 UTC m=+775.334352509" lastFinishedPulling="2026-02-20 06:59:21.87939519 +0000 UTC m=+776.752021931" observedRunningTime="2026-02-20 06:59:22.503560612 +0000 UTC m=+777.376187333" watchObservedRunningTime="2026-02-20 06:59:22.511454592 +0000 UTC m=+777.384081343"
Feb 20 06:59:29 crc kubenswrapper[5094]: I0220 06:59:29.762151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:29 crc kubenswrapper[5094]: I0220 06:59:29.763225 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:29 crc kubenswrapper[5094]: I0220 06:59:29.804798 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:30 crc kubenswrapper[5094]: I0220 06:59:30.614022 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:30 crc kubenswrapper[5094]: I0220 06:59:30.686626 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"]
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.463841 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"]
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.465436 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.486336 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"]
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.554904 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4z8d2" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" containerID="cri-o://2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" gracePeriod=2
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.558820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.559024 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.559094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660029 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660149 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.661058 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.708040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.798619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk"
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.000986 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2"
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.067386 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"]
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.068027 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") "
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.068228 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") "
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.068286 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") "
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.069275 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities" (OuterVolumeSpecName: "utilities") pod "821ee7ec-9cd3-4402-b70d-1c06d52aeb22" (UID: "821ee7ec-9cd3-4402-b70d-1c06d52aeb22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.075342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6" (OuterVolumeSpecName: "kube-api-access-lq6q6") pod "821ee7ec-9cd3-4402-b70d-1c06d52aeb22" (UID: "821ee7ec-9cd3-4402-b70d-1c06d52aeb22"). InnerVolumeSpecName "kube-api-access-lq6q6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.093768 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "821ee7ec-9cd3-4402-b70d-1c06d52aeb22" (UID: "821ee7ec-9cd3-4402-b70d-1c06d52aeb22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.169386 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.169428 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.169446 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") on node \"crc\" DevicePath \"\""
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.561871 5094 generic.go:334] "Generic (PLEG): container finished" podID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" exitCode=0
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.562000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df"}
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.562077 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerStarted","Data":"5046403aa61e0aba40eb47656898bc09f17cf4dac06fc90a524dc06a79c5bd91"}
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.564921 5094 generic.go:334] "Generic (PLEG): container finished" podID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" exitCode=0
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.564987 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67"}
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.565033 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb"}
Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.565041 5094 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.565057 5094 scope.go:117] "RemoveContainer" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.581229 5094 scope.go:117] "RemoveContainer" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.610159 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.612118 5094 scope.go:117] "RemoveContainer" containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.623881 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.629558 5094 scope.go:117] "RemoveContainer" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" Feb 20 06:59:33 crc kubenswrapper[5094]: E0220 06:59:33.630171 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67\": container with ID starting with 2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67 not found: ID does not exist" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630224 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67"} err="failed to get container status \"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67\": rpc error: code = NotFound desc = could not find container 
\"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67\": container with ID starting with 2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67 not found: ID does not exist" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630257 5094 scope.go:117] "RemoveContainer" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" Feb 20 06:59:33 crc kubenswrapper[5094]: E0220 06:59:33.630496 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1\": container with ID starting with e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1 not found: ID does not exist" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630526 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1"} err="failed to get container status \"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1\": rpc error: code = NotFound desc = could not find container \"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1\": container with ID starting with e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1 not found: ID does not exist" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630543 5094 scope.go:117] "RemoveContainer" containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" Feb 20 06:59:33 crc kubenswrapper[5094]: E0220 06:59:33.631475 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2\": container with ID starting with 892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2 not found: ID does not exist" 
containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.631627 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2"} err="failed to get container status \"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2\": rpc error: code = NotFound desc = could not find container \"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2\": container with ID starting with 892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2 not found: ID does not exist" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.849186 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" path="/var/lib/kubelet/pods/821ee7ec-9cd3-4402-b70d-1c06d52aeb22/volumes" Feb 20 06:59:34 crc kubenswrapper[5094]: I0220 06:59:34.576581 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerStarted","Data":"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4"} Feb 20 06:59:35 crc kubenswrapper[5094]: I0220 06:59:35.590674 5094 generic.go:334] "Generic (PLEG): container finished" podID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" exitCode=0 Feb 20 06:59:35 crc kubenswrapper[5094]: I0220 06:59:35.591828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4"} Feb 20 06:59:36 crc kubenswrapper[5094]: I0220 06:59:36.602137 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" 
event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerStarted","Data":"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2"} Feb 20 06:59:36 crc kubenswrapper[5094]: I0220 06:59:36.636860 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4nkfk" podStartSLOduration=2.185496107 podStartE2EDuration="4.636831395s" podCreationTimestamp="2026-02-20 06:59:32 +0000 UTC" firstStartedPulling="2026-02-20 06:59:33.563757353 +0000 UTC m=+788.436384054" lastFinishedPulling="2026-02-20 06:59:36.015092621 +0000 UTC m=+790.887719342" observedRunningTime="2026-02-20 06:59:36.633109667 +0000 UTC m=+791.505736418" watchObservedRunningTime="2026-02-20 06:59:36.636831395 +0000 UTC m=+791.509458116" Feb 20 06:59:42 crc kubenswrapper[5094]: I0220 06:59:42.799462 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:42 crc kubenswrapper[5094]: I0220 06:59:42.800734 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:43 crc kubenswrapper[5094]: I0220 06:59:43.885413 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4nkfk" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server" probeResult="failure" output=< Feb 20 06:59:43 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:59:43 crc kubenswrapper[5094]: > Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.792207 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 06:59:49 crc kubenswrapper[5094]: E0220 06:59:49.793213 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-utilities" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 
06:59:49.793242 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-utilities" Feb 20 06:59:49 crc kubenswrapper[5094]: E0220 06:59:49.793296 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.793310 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" Feb 20 06:59:49 crc kubenswrapper[5094]: E0220 06:59:49.793340 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-content" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.793355 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-content" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.793602 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.795139 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.814391 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.849575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.849784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.849828 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.951260 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.951582 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.951653 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.952470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.952974 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.985977 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.135122 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.434456 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.708390 5094 generic.go:334] "Generic (PLEG): container finished" podID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d" exitCode=0 Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.708790 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d"} Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.708917 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerStarted","Data":"1e6e6e94902ae5b12e4f706c4f350273359bd465dfb08de600352fdbf620c8db"} Feb 20 06:59:51 crc kubenswrapper[5094]: I0220 06:59:51.717186 5094 generic.go:334] "Generic (PLEG): container finished" podID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f" exitCode=0 Feb 20 06:59:51 crc kubenswrapper[5094]: I0220 06:59:51.717309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f"} Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.729640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" 
event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerStarted","Data":"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"} Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.765112 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4gmch" podStartSLOduration=2.34297934 podStartE2EDuration="3.765077918s" podCreationTimestamp="2026-02-20 06:59:49 +0000 UTC" firstStartedPulling="2026-02-20 06:59:50.71018237 +0000 UTC m=+805.582809081" lastFinishedPulling="2026-02-20 06:59:52.132280918 +0000 UTC m=+807.004907659" observedRunningTime="2026-02-20 06:59:52.758189882 +0000 UTC m=+807.630816613" watchObservedRunningTime="2026-02-20 06:59:52.765077918 +0000 UTC m=+807.637704669" Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.875469 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.943028 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.176573 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"] Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.176996 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4nkfk" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server" containerID="cri-o://b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" gracePeriod=2 Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.638059 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.752910 5094 generic.go:334] "Generic (PLEG): container finished" podID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" exitCode=0 Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.752979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2"} Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.753039 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"5046403aa61e0aba40eb47656898bc09f17cf4dac06fc90a524dc06a79c5bd91"} Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.753038 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.753074 5094 scope.go:117] "RemoveContainer" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.776571 5094 scope.go:117] "RemoveContainer" containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.799549 5094 scope.go:117] "RemoveContainer" containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.821418 5094 scope.go:117] "RemoveContainer" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" Feb 20 06:59:55 crc kubenswrapper[5094]: E0220 06:59:55.822319 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2\": container with ID starting with b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2 not found: ID does not exist" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.822379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2"} err="failed to get container status \"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2\": rpc error: code = NotFound desc = could not find container \"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2\": container with ID starting with b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2 not found: ID does not exist" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.822413 5094 scope.go:117] "RemoveContainer" 
containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" Feb 20 06:59:55 crc kubenswrapper[5094]: E0220 06:59:55.823070 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4\": container with ID starting with 76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4 not found: ID does not exist" containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.823108 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4"} err="failed to get container status \"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4\": rpc error: code = NotFound desc = could not find container \"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4\": container with ID starting with 76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4 not found: ID does not exist" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.823146 5094 scope.go:117] "RemoveContainer" containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" Feb 20 06:59:55 crc kubenswrapper[5094]: E0220 06:59:55.824181 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df\": container with ID starting with 503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df not found: ID does not exist" containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.824241 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df"} err="failed to get container status \"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df\": rpc error: code = NotFound desc = could not find container \"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df\": container with ID starting with 503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df not found: ID does not exist" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.837918 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"cea80939-e9e2-40e3-9a29-06cef37a5482\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.837982 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"cea80939-e9e2-40e3-9a29-06cef37a5482\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.838072 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"cea80939-e9e2-40e3-9a29-06cef37a5482\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.840637 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities" (OuterVolumeSpecName: "utilities") pod "cea80939-e9e2-40e3-9a29-06cef37a5482" (UID: "cea80939-e9e2-40e3-9a29-06cef37a5482"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.846312 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r" (OuterVolumeSpecName: "kube-api-access-xgm4r") pod "cea80939-e9e2-40e3-9a29-06cef37a5482" (UID: "cea80939-e9e2-40e3-9a29-06cef37a5482"). InnerVolumeSpecName "kube-api-access-xgm4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.941663 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.942845 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.961204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cea80939-e9e2-40e3-9a29-06cef37a5482" (UID: "cea80939-e9e2-40e3-9a29-06cef37a5482"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 06:59:56 crc kubenswrapper[5094]: I0220 06:59:56.044568 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 06:59:56 crc kubenswrapper[5094]: I0220 06:59:56.090266 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"]
Feb 20 06:59:56 crc kubenswrapper[5094]: I0220 06:59:56.097365 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"]
Feb 20 06:59:57 crc kubenswrapper[5094]: I0220 06:59:57.850724 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" path="/var/lib/kubelet/pods/cea80939-e9e2-40e3-9a29-06cef37a5482/volumes"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.135638 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4gmch"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.135934 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4gmch"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.195815 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"]
Feb 20 07:00:00 crc kubenswrapper[5094]: E0220 07:00:00.196572 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198032 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server"
Feb 20 07:00:00 crc kubenswrapper[5094]: E0220 07:00:00.198221 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-content"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198335 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-content"
Feb 20 07:00:00 crc kubenswrapper[5094]: E0220 07:00:00.198452 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-utilities"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198557 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-utilities"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198926 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.199795 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.203047 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.204078 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.210264 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"]
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.224266 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4gmch"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.311887 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.312067 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.312102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.414068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.414143 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.414175 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.415513 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.422471 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.438490 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.522210 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.769865 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"]
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.788193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" event={"ID":"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d","Type":"ContainerStarted","Data":"b58487640e24a04a74b5b564afcd111dac042cd6580a942d0fd81dbfd354738f"}
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.853227 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4gmch"
Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.925740 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"]
Feb 20 07:00:01 crc kubenswrapper[5094]: I0220 07:00:01.796879 5094 generic.go:334] "Generic (PLEG): container finished" podID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerID="a3c7984448d7f3db690223dff864550548436ad39114dac24772a87d3288c8ea" exitCode=0
Feb 20 07:00:01 crc kubenswrapper[5094]: I0220 07:00:01.796966 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" event={"ID":"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d","Type":"ContainerDied","Data":"a3c7984448d7f3db690223dff864550548436ad39114dac24772a87d3288c8ea"}
Feb 20 07:00:02 crc kubenswrapper[5094]: I0220 07:00:02.806606 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4gmch" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" containerID="cri-o://613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577" gracePeriod=2
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.179612 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.302229 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.366768 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") "
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.366941 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") "
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.367186 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") "
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.368229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume" (OuterVolumeSpecName: "config-volume") pod "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" (UID: "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.373171 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" (UID: "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.374970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn" (OuterVolumeSpecName: "kube-api-access-49snn") pod "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" (UID: "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d"). InnerVolumeSpecName "kube-api-access-49snn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.468768 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"89113f3f-3992-4b74-af7f-4d31f0322f24\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") "
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.468968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"89113f3f-3992-4b74-af7f-4d31f0322f24\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") "
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469033 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"89113f3f-3992-4b74-af7f-4d31f0322f24\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") "
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469310 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") on node \"crc\" DevicePath \"\""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469330 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469343 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.470155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities" (OuterVolumeSpecName: "utilities") pod "89113f3f-3992-4b74-af7f-4d31f0322f24" (UID: "89113f3f-3992-4b74-af7f-4d31f0322f24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.473064 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b" (OuterVolumeSpecName: "kube-api-access-gnn7b") pod "89113f3f-3992-4b74-af7f-4d31f0322f24" (UID: "89113f3f-3992-4b74-af7f-4d31f0322f24"). InnerVolumeSpecName "kube-api-access-gnn7b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.548378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89113f3f-3992-4b74-af7f-4d31f0322f24" (UID: "89113f3f-3992-4b74-af7f-4d31f0322f24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.586531 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.586650 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") on node \"crc\" DevicePath \"\""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.586684 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.814963 5094 generic.go:334] "Generic (PLEG): container finished" podID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577" exitCode=0
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815056 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"}
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"1e6e6e94902ae5b12e4f706c4f350273359bd465dfb08de600352fdbf620c8db"}
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815538 5094 scope.go:117] "RemoveContainer" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.817237 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" event={"ID":"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d","Type":"ContainerDied","Data":"b58487640e24a04a74b5b564afcd111dac042cd6580a942d0fd81dbfd354738f"}
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.817264 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b58487640e24a04a74b5b564afcd111dac042cd6580a942d0fd81dbfd354738f"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.817320 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.838566 5094 scope.go:117] "RemoveContainer" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.858639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"]
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.862293 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"]
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.871639 5094 scope.go:117] "RemoveContainer" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.888105 5094 scope.go:117] "RemoveContainer" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"
Feb 20 07:00:03 crc kubenswrapper[5094]: E0220 07:00:03.888643 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577\": container with ID starting with 613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577 not found: ID does not exist" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.888746 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"} err="failed to get container status \"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577\": rpc error: code = NotFound desc = could not find container \"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577\": container with ID starting with 613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577 not found: ID does not exist"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.888793 5094 scope.go:117] "RemoveContainer" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f"
Feb 20 07:00:03 crc kubenswrapper[5094]: E0220 07:00:03.889239 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f\": container with ID starting with 6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f not found: ID does not exist" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.889311 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f"} err="failed to get container status \"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f\": rpc error: code = NotFound desc = could not find container \"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f\": container with ID starting with 6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f not found: ID does not exist"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.889370 5094 scope.go:117] "RemoveContainer" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d"
Feb 20 07:00:03 crc kubenswrapper[5094]: E0220 07:00:03.889860 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d\": container with ID starting with 95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d not found: ID does not exist" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d"
Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.889935 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d"} err="failed to get container status \"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d\": rpc error: code = NotFound desc = could not find container \"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d\": container with ID starting with 95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d not found: ID does not exist"
Feb 20 07:00:05 crc kubenswrapper[5094]: I0220 07:00:05.847694 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" path="/var/lib/kubelet/pods/89113f3f-3992-4b74-af7f-4d31f0322f24/volumes"
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.450021 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"]
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451529 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" containerID="cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451613 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451835 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" containerID="cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451897 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-acl-logging" containerID="cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451819 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="nbdb" containerID="cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451936 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" containerID="cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.452100 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" containerID="cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.558141 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" containerID="cri-o://1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" gracePeriod=30
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.841248 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log"
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.844337 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-acl-logging/0.log"
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.844853 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-controller/0.log"
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.845496 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc"
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.885991 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886050 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886059 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log"
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886087 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886117 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886145 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886277 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886314 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886341 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886436 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886506 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886539 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886582 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886605 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886635 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886667 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") "
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886882 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887395 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887440 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887818 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket" (OuterVolumeSpecName: "log-socket") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888805 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log" (OuterVolumeSpecName: "node-log") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888909 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888941 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash" (OuterVolumeSpecName: "host-slash") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888972 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889452 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889490 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889585 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.894309 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-acl-logging/0.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.895184 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-controller/0.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896127 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896578 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896617 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896626 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896634 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896644 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896652 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896660 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" exitCode=143 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896672 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" exitCode=143 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896791 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896806 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896840 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" 
event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896856 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896868 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896874 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896880 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896885 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896890 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896896 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896901 5094 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896906 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896912 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896920 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896927 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896933 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896938 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896943 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: 
I0220 07:00:10.896949 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896954 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896960 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896966 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896971 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896987 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896994 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897000 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897006 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897014 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897020 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897026 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897032 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897038 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897272 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897282 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897293 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897302 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897308 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897314 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897319 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897325 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897330 5094 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897335 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897340 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897346 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897258 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897243 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.900285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6" (OuterVolumeSpecName: "kube-api-access-swnw6") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "kube-api-access-swnw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.910670 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/2.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.912306 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.912812 5094 generic.go:334] "Generic (PLEG): container finished" podID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" containerID="0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793" exitCode=2 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.912977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerDied","Data":"0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.913166 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.913984 5094 scope.go:117] "RemoveContainer" containerID="0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.918775 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mfxc9"] Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919122 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919144 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919165 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919179 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919193 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919206 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919219 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919229 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919244 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919254 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919268 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="nbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919279 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" 
containerName="nbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919293 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="extract-utilities" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919303 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="extract-utilities" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919315 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919326 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919338 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919348 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919362 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerName="collect-profiles" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919371 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerName="collect-profiles" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919387 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-acl-logging" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919397 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" 
containerName="ovn-acl-logging" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919411 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919462 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919482 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919493 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919511 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919521 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919558 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kubecfg-setup" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919572 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kubecfg-setup" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919588 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="extract-content" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919598 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" 
containerName="extract-content" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919781 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919801 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerName="collect-profiles" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919818 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919830 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919847 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919858 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919875 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919889 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-acl-logging" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919904 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="nbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919918 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919930 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919940 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.920100 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.920114 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.920262 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.920553 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.926454 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.929181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.930651 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.971633 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989788 5094 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989841 5094 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989856 5094 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989933 5094 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989948 5094 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989962 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989976 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989989 5094 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990001 5094 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990013 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990025 5094 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990039 5094 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990051 5094 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc 
kubenswrapper[5094]: I0220 07:00:10.990066 5094 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990079 5094 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990096 5094 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990114 5094 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990130 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990146 5094 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990159 5094 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.011548 5094 scope.go:117] 
"RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.045301 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.067273 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.085404 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091512 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e4a996a-7aca-4bec-b29f-084adfb08333-ovn-node-metrics-cert\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091562 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-node-log\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091596 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-netns\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091627 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-env-overrides\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091663 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-kubelet\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092543 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-script-lib\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092632 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-slash\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-etc-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-bin\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bw5z\" (UniqueName: \"kubernetes.io/projected/9e4a996a-7aca-4bec-b29f-084adfb08333-kube-api-access-4bw5z\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093068 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-var-lib-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093273 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-log-socket\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093467 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-netd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-ovn\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093565 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-config\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc 
kubenswrapper[5094]: I0220 07:00:11.093591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-systemd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-systemd-units\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.109952 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.130284 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.149916 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.167792 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.168314 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: 
I0220 07:00:11.168353 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.168387 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.168729 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.168758 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.168777 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc 
kubenswrapper[5094]: E0220 07:00:11.169374 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169402 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169420 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.169691 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169736 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status 
\"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169753 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.170139 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170166 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170188 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.170599 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170696 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170748 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.171222 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171258 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID 
starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171278 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.171526 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171556 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171582 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.171925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 
07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171964 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171984 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.172229 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172258 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172277 5094 scope.go:117] "RemoveContainer" 
containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172766 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172797 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173110 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173139 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173378 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could 
not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173405 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173828 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173857 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174227 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174254 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 
07:00:11.174494 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174521 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174826 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174854 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175286 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 
2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175317 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175542 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175568 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175970 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175997 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176324 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176354 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176757 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176821 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177087 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not 
exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177116 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177522 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177554 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177867 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177896 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178141 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status 
\"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178167 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178420 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178462 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179008 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179039 5094 scope.go:117] "RemoveContainer" 
containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179303 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179335 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179540 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179568 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180044 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could 
not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180081 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180326 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180352 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180560 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180580 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 
07:00:11.181037 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181066 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181308 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181334 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181741 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 
192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181767 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.184749 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.184785 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185135 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185202 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185659 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185691 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.186061 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.186092 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.186606 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not 
exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194228 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-netns\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194287 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-env-overrides\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194323 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-kubelet\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194373 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-script-lib\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194395 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-slash\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194421 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-etc-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194443 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-bin\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194469 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bw5z\" (UniqueName: \"kubernetes.io/projected/9e4a996a-7aca-4bec-b29f-084adfb08333-kube-api-access-4bw5z\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194556 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-var-lib-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194582 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-log-socket\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194644 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-netd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194670 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-ovn\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194690 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-config\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194740 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-systemd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194765 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-systemd-units\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e4a996a-7aca-4bec-b29f-084adfb08333-ovn-node-metrics-cert\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-node-log\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194928 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-node-log\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194980 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-netns\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195373 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-kubelet\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195748 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-env-overrides\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-log-socket\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195834 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-slash\") pod \"ovnkube-node-mfxc9\" (UID: 
\"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195860 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-etc-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-bin\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195916 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-script-lib\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196249 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-netd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-ovn\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196329 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-var-lib-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196943 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-config\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196998 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-systemd-units\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.197033 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-systemd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.198413 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.203406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e4a996a-7aca-4bec-b29f-084adfb08333-ovn-node-metrics-cert\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.214550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bw5z\" (UniqueName: \"kubernetes.io/projected/9e4a996a-7aca-4bec-b29f-084adfb08333-kube-api-access-4bw5z\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.240218 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"] Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.245088 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"] Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.256672 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: W0220 07:00:11.288334 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4a996a_7aca_4bec_b29f_084adfb08333.slice/crio-37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361 WatchSource:0}: Error finding container 37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361: Status 404 returned error can't find the container with id 37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361 Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.851090 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" path="/var/lib/kubelet/pods/d1c36de3-d36b-48ed-9d4d-3aa52d72add0/volumes" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.927310 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/2.log" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.928369 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.928488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"1d279e83ea8ce219bda95b62bfc1a070141835ac04e351b972e1f436008c1683"} Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.932471 5094 generic.go:334] "Generic (PLEG): container finished" podID="9e4a996a-7aca-4bec-b29f-084adfb08333" containerID="f77448e24adb1d4db64bba7460ba8b08e7bf9afb7b7b9e95a62bbc45afd95581" exitCode=0 Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.932524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerDied","Data":"f77448e24adb1d4db64bba7460ba8b08e7bf9afb7b7b9e95a62bbc45afd95581"} Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.932545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.944651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"04922c8d1356f0213893ad03eb2eb98102ac071346518cf74a777806c4766952"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945389 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"8f8e6cc3d30ef9904866577122b5f3fb523207ceea1a1103df1fb9c7f084d947"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"91b200f21b2028e01f125b98f346d32989f273ef639b1a40fa04e342f6361bc1"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945419 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"e65dcc683e28168ee2a8e8e100fff4170469a417a2153f8103028dbe2708a161"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945432 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" 
event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"88c024bcff9b37a832bdfebb468494f719fe1848779e758af6cda6462f947610"} Feb 20 07:00:13 crc kubenswrapper[5094]: I0220 07:00:13.960627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"c6acd2b6f417927a757a926676c9d09cddd0ab6c3955612a17294de6e0e22c20"} Feb 20 07:00:15 crc kubenswrapper[5094]: I0220 07:00:15.982507 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"26c55ff0bec4de361a2c42c2f62fd56bb505ae7e61509c1c8b001cec3dd7c816"} Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004129 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"339f90966a3a5116c39b9597ba0abbb3d3664f9fb05a6b1ef881eb423112fec4"} Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004558 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004587 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004597 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.042427 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" podStartSLOduration=8.042406697 podStartE2EDuration="8.042406697s" podCreationTimestamp="2026-02-20 07:00:10 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:00:18.041374312 +0000 UTC m=+832.914001063" watchObservedRunningTime="2026-02-20 07:00:18.042406697 +0000 UTC m=+832.915033418" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.050932 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.053139 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:19 crc kubenswrapper[5094]: I0220 07:00:19.996806 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 07:00:19 crc kubenswrapper[5094]: I0220 07:00:19.998471 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.001063 5094 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-c5nt4" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.003391 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.003399 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.003864 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.010969 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.039130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.039499 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.039800 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.140924 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.141423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.141474 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.141666 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.142508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.173407 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.328282 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369288 5094 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369377 5094 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369405 5094 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369464 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-48gmd" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" Feb 20 07:00:21 crc kubenswrapper[5094]: I0220 07:00:21.024030 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: I0220 07:00:21.024871 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.067817 5094 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.067924 5094 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.067967 5094 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.068053 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-48gmd" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" Feb 20 07:00:26 crc kubenswrapper[5094]: I0220 07:00:26.182525 5094 scope.go:117] "RemoveContainer" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79" Feb 20 07:00:27 crc kubenswrapper[5094]: I0220 07:00:27.079093 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/2.log" Feb 20 07:00:35 crc kubenswrapper[5094]: I0220 07:00:35.844290 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:35 crc kubenswrapper[5094]: I0220 07:00:35.847165 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:36 crc kubenswrapper[5094]: I0220 07:00:36.140986 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 07:00:37 crc kubenswrapper[5094]: I0220 07:00:37.156269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-48gmd" event={"ID":"9d8b4842-acdc-4e60-9de5-b7b6dde61b62","Type":"ContainerStarted","Data":"667a553c1273f6b25d4e22a567d26c154e982736d7b65b462fc6b3be733c0965"} Feb 20 07:00:38 crc kubenswrapper[5094]: I0220 07:00:38.168183 5094 generic.go:334] "Generic (PLEG): container finished" podID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerID="07fcab491ccca10a02c6e686a0115bd8c0916121144d5fd12b7356bb88847cbf" exitCode=0 Feb 20 07:00:38 crc kubenswrapper[5094]: I0220 07:00:38.168282 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-48gmd" event={"ID":"9d8b4842-acdc-4e60-9de5-b7b6dde61b62","Type":"ContainerDied","Data":"07fcab491ccca10a02c6e686a0115bd8c0916121144d5fd12b7356bb88847cbf"} Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.460518 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475010 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475077 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475097 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475298 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9d8b4842-acdc-4e60-9de5-b7b6dde61b62" (UID: "9d8b4842-acdc-4e60-9de5-b7b6dde61b62"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.486505 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk" (OuterVolumeSpecName: "kube-api-access-wlmfk") pod "9d8b4842-acdc-4e60-9de5-b7b6dde61b62" (UID: "9d8b4842-acdc-4e60-9de5-b7b6dde61b62"). InnerVolumeSpecName "kube-api-access-wlmfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.493168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9d8b4842-acdc-4e60-9de5-b7b6dde61b62" (UID: "9d8b4842-acdc-4e60-9de5-b7b6dde61b62"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.577003 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.577065 5094 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.577086 5094 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:40 crc kubenswrapper[5094]: I0220 07:00:40.183844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-48gmd" event={"ID":"9d8b4842-acdc-4e60-9de5-b7b6dde61b62","Type":"ContainerDied","Data":"667a553c1273f6b25d4e22a567d26c154e982736d7b65b462fc6b3be733c0965"} Feb 20 07:00:40 crc kubenswrapper[5094]: I0220 07:00:40.183978 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667a553c1273f6b25d4e22a567d26c154e982736d7b65b462fc6b3be733c0965" Feb 20 07:00:40 crc kubenswrapper[5094]: I0220 07:00:40.184025 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:41 crc kubenswrapper[5094]: I0220 07:00:41.296807 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.729844 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr"] Feb 20 07:00:47 crc kubenswrapper[5094]: E0220 07:00:47.730473 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerName="storage" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.730491 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerName="storage" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.730613 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerName="storage" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.731630 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.735121 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.744080 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr"] Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.902180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.903108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.904268 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: 
I0220 07:00:48.005786 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.006357 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.006433 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.007199 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.007472 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.038654 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.049650 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.307196 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr"] Feb 20 07:00:49 crc kubenswrapper[5094]: I0220 07:00:49.266918 5094 generic.go:334] "Generic (PLEG): container finished" podID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerID="f8c8b136d680dadbbed473d708411afa42e083a811c03a3a1b033f0d8631afea" exitCode=0 Feb 20 07:00:49 crc kubenswrapper[5094]: I0220 07:00:49.266994 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"f8c8b136d680dadbbed473d708411afa42e083a811c03a3a1b033f0d8631afea"} Feb 20 07:00:49 crc kubenswrapper[5094]: I0220 07:00:49.267384 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerStarted","Data":"adc4324be402413f71eed86574b9f5335193dda4aaef65f248bc678eef38fdbf"} Feb 20 07:00:51 crc kubenswrapper[5094]: I0220 07:00:51.284228 5094 generic.go:334] "Generic (PLEG): container finished" podID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerID="e747dcf9fca7b80536c627106751461820e39ec33525c96f1c60053606c68ad6" exitCode=0 Feb 20 07:00:51 crc kubenswrapper[5094]: I0220 07:00:51.284989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"e747dcf9fca7b80536c627106751461820e39ec33525c96f1c60053606c68ad6"} Feb 20 07:00:52 crc kubenswrapper[5094]: I0220 07:00:52.297917 5094 generic.go:334] "Generic (PLEG): container finished" podID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerID="97f353e9e01dbc2588d65d9823ad2fa367bc88377f879c7e8bb262dd866c71e7" exitCode=0 Feb 20 07:00:52 crc kubenswrapper[5094]: I0220 07:00:52.298015 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"97f353e9e01dbc2588d65d9823ad2fa367bc88377f879c7e8bb262dd866c71e7"} Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.617749 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.814992 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.815375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.815439 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.817300 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle" (OuterVolumeSpecName: "bundle") pod "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" (UID: "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.827235 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn" (OuterVolumeSpecName: "kube-api-access-ppxmn") pod "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" (UID: "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5"). InnerVolumeSpecName "kube-api-access-ppxmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.917312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.917423 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.041432 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util" (OuterVolumeSpecName: "util") pod "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" (UID: "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.120641 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.316916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"adc4324be402413f71eed86574b9f5335193dda4aaef65f248bc678eef38fdbf"} Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.316993 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc4324be402413f71eed86574b9f5335193dda4aaef65f248bc678eef38fdbf" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.317013 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.330383 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qg9ms"] Feb 20 07:00:59 crc kubenswrapper[5094]: E0220 07:00:59.331071 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="extract" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331088 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="extract" Feb 20 07:00:59 crc kubenswrapper[5094]: E0220 07:00:59.331107 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="util" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331115 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="util" Feb 20 07:00:59 crc kubenswrapper[5094]: E0220 07:00:59.331132 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="pull" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331141 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="pull" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331257 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="extract" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331759 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.335686 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.337427 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.337625 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-5hz2d" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.358375 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qg9ms"] Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.506043 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzkm\" (UniqueName: \"kubernetes.io/projected/6804a7c3-a0d7-46d4-b317-e9c54265841e-kube-api-access-dqzkm\") pod \"nmstate-operator-694c9596b7-qg9ms\" (UID: \"6804a7c3-a0d7-46d4-b317-e9c54265841e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.607953 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqzkm\" (UniqueName: \"kubernetes.io/projected/6804a7c3-a0d7-46d4-b317-e9c54265841e-kube-api-access-dqzkm\") pod \"nmstate-operator-694c9596b7-qg9ms\" (UID: \"6804a7c3-a0d7-46d4-b317-e9c54265841e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.643203 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqzkm\" (UniqueName: \"kubernetes.io/projected/6804a7c3-a0d7-46d4-b317-e9c54265841e-kube-api-access-dqzkm\") pod \"nmstate-operator-694c9596b7-qg9ms\" (UID: 
\"6804a7c3-a0d7-46d4-b317-e9c54265841e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.700226 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.933332 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qg9ms"] Feb 20 07:01:00 crc kubenswrapper[5094]: I0220 07:01:00.363548 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" event={"ID":"6804a7c3-a0d7-46d4-b317-e9c54265841e","Type":"ContainerStarted","Data":"eab1a11034ff39ef5fda44571267c1e6c0ac93b83ad9c282a61ab3324839e8e3"} Feb 20 07:01:02 crc kubenswrapper[5094]: I0220 07:01:02.376840 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" event={"ID":"6804a7c3-a0d7-46d4-b317-e9c54265841e","Type":"ContainerStarted","Data":"740b9e396fff92319abe6b0800e4bcd00c92c4eeb0297b46d6d41e0d1337ea99"} Feb 20 07:01:02 crc kubenswrapper[5094]: I0220 07:01:02.400633 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" podStartSLOduration=1.310232204 podStartE2EDuration="3.400612127s" podCreationTimestamp="2026-02-20 07:00:59 +0000 UTC" firstStartedPulling="2026-02-20 07:00:59.97056232 +0000 UTC m=+874.843189061" lastFinishedPulling="2026-02-20 07:01:02.060942273 +0000 UTC m=+876.933568984" observedRunningTime="2026-02-20 07:01:02.396162401 +0000 UTC m=+877.268789112" watchObservedRunningTime="2026-02-20 07:01:02.400612127 +0000 UTC m=+877.273238838" Feb 20 07:01:04 crc kubenswrapper[5094]: I0220 07:01:04.108041 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:01:04 crc kubenswrapper[5094]: I0220 07:01:04.108178 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.817115 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.818522 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.821490 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-s8wfg" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.846592 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.847417 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.848332 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.849771 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.859455 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.877645 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jr284"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.879658 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.926426 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzp8\" (UniqueName: \"kubernetes.io/projected/93238aee-86f0-497a-8880-531338e8245f-kube-api-access-tnzp8\") pod \"nmstate-metrics-58c85c668d-gvgqm\" (UID: \"93238aee-86f0-497a-8880-531338e8245f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.982339 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.988924 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.991595 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-q8vqp" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.002021 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.006006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.014612 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edd001fc-3ddc-4010-8a98-54f4ffeaba72-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031937 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-ovs-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031962 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/df45fab4-d183-4702-b5b6-2a4e559eff22-kube-api-access-kf4mx\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031981 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp2f5\" (UniqueName: \"kubernetes.io/projected/edd001fc-3ddc-4010-8a98-54f4ffeaba72-kube-api-access-dp2f5\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df45fab4-d183-4702-b5b6-2a4e559eff22-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032020 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-dbus-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032047 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzp8\" (UniqueName: \"kubernetes.io/projected/93238aee-86f0-497a-8880-531338e8245f-kube-api-access-tnzp8\") pod \"nmstate-metrics-58c85c668d-gvgqm\" (UID: \"93238aee-86f0-497a-8880-531338e8245f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032087 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffcc6\" (UniqueName: \"kubernetes.io/projected/55b1a421-7ec5-4442-b4c5-11767715cc4b-kube-api-access-ffcc6\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-nmstate-lock\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.061209 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzp8\" (UniqueName: \"kubernetes.io/projected/93238aee-86f0-497a-8880-531338e8245f-kube-api-access-tnzp8\") pod \"nmstate-metrics-58c85c668d-gvgqm\" (UID: \"93238aee-86f0-497a-8880-531338e8245f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edd001fc-3ddc-4010-8a98-54f4ffeaba72-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133397 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-ovs-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/df45fab4-d183-4702-b5b6-2a4e559eff22-kube-api-access-kf4mx\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133447 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp2f5\" (UniqueName: \"kubernetes.io/projected/edd001fc-3ddc-4010-8a98-54f4ffeaba72-kube-api-access-dp2f5\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df45fab4-d183-4702-b5b6-2a4e559eff22-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133490 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-dbus-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133518 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133525 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-ovs-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffcc6\" (UniqueName: \"kubernetes.io/projected/55b1a421-7ec5-4442-b4c5-11767715cc4b-kube-api-access-ffcc6\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133635 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-nmstate-lock\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: E0220 07:01:08.133795 5094 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-nmstate-lock\") pod \"nmstate-handler-jr284\" (UID: 
\"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: E0220 07:01:08.133917 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert podName:edd001fc-3ddc-4010-8a98-54f4ffeaba72 nodeName:}" failed. No retries permitted until 2026-02-20 07:01:08.633887364 +0000 UTC m=+883.506514075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-74czm" (UID: "edd001fc-3ddc-4010-8a98-54f4ffeaba72") : secret "plugin-serving-cert" not found Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133945 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-dbus-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.135658 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edd001fc-3ddc-4010-8a98-54f4ffeaba72-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.140301 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.148901 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df45fab4-d183-4702-b5b6-2a4e559eff22-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.157718 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffcc6\" (UniqueName: \"kubernetes.io/projected/55b1a421-7ec5-4442-b4c5-11767715cc4b-kube-api-access-ffcc6\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.161223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp2f5\" (UniqueName: \"kubernetes.io/projected/edd001fc-3ddc-4010-8a98-54f4ffeaba72-kube-api-access-dp2f5\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.167946 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/df45fab4-d183-4702-b5b6-2a4e559eff22-kube-api-access-kf4mx\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.173138 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.197286 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-576d49774f-hnd7r"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.198205 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.203735 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.221083 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576d49774f-hnd7r"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238100 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-trusted-ca-bundle\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238216 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqgt\" (UniqueName: \"kubernetes.io/projected/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-kube-api-access-ksqgt\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " 
pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238292 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-oauth-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238373 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-service-ca\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-oauth-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.339791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-trusted-ca-bundle\") pod \"console-576d49774f-hnd7r\" 
(UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340287 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqgt\" (UniqueName: \"kubernetes.io/projected/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-kube-api-access-ksqgt\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340338 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-oauth-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340380 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340406 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-service-ca\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-oauth-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340992 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-trusted-ca-bundle\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.341861 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-service-ca\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.342181 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.342543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-oauth-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.350241 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.353101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-oauth-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.359070 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqgt\" (UniqueName: \"kubernetes.io/projected/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-kube-api-access-ksqgt\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.413816 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm"]
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.559575 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.615573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jr284" event={"ID":"55b1a421-7ec5-4442-b4c5-11767715cc4b","Type":"ContainerStarted","Data":"c6a4943977a48d5175b7fd4b510b3e380a8721941eafb11c1cb04bd173f5b2f9"}
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.620330 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" event={"ID":"93238aee-86f0-497a-8880-531338e8245f","Type":"ContainerStarted","Data":"7bf3c0518fdab7634cd49d528214cd2710cb999119a452fee819345c75715e60"}
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.645512 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.657424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.698012 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"]
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.817798 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576d49774f-hnd7r"]
Feb 20 07:01:08 crc kubenswrapper[5094]: W0220 07:01:08.831301 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731a7ef0_b1ea_4ad3_96d4_8668ebbe871b.slice/crio-9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209 WatchSource:0}: Error finding container 9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209: Status 404 returned error can't find the container with id 9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209
Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.901960 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"
Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.208415 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"]
Feb 20 07:01:09 crc kubenswrapper[5094]: W0220 07:01:09.214971 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd001fc_3ddc_4010_8a98_54f4ffeaba72.slice/crio-da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d WatchSource:0}: Error finding container da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d: Status 404 returned error can't find the container with id da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d
Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.629599 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" event={"ID":"df45fab4-d183-4702-b5b6-2a4e559eff22","Type":"ContainerStarted","Data":"0a97949b43680f0ac65c2584124f8331bb477d979486e053779dd6a56de3a9e3"}
Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.630812 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" event={"ID":"edd001fc-3ddc-4010-8a98-54f4ffeaba72","Type":"ContainerStarted","Data":"da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d"}
Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.632452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576d49774f-hnd7r" event={"ID":"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b","Type":"ContainerStarted","Data":"570a069da75de306ed661786db7b4736fffbcee8209d9f5c2a6bc25200d0c644"}
Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.632480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576d49774f-hnd7r" event={"ID":"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b","Type":"ContainerStarted","Data":"9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209"}
Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.658012 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-576d49774f-hnd7r" podStartSLOduration=1.6579836860000001 podStartE2EDuration="1.657983686s" podCreationTimestamp="2026-02-20 07:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:01:09.652160105 +0000 UTC m=+884.524786826" watchObservedRunningTime="2026-02-20 07:01:09.657983686 +0000 UTC m=+884.530610397"
Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.658671 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jr284" event={"ID":"55b1a421-7ec5-4442-b4c5-11767715cc4b","Type":"ContainerStarted","Data":"dd1809a65bcabb6d4cf68c2bc01cfea4e44d1732f241daaf2a9030621bf66bc2"}
Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.659492 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jr284"
Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.661536 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" event={"ID":"df45fab4-d183-4702-b5b6-2a4e559eff22","Type":"ContainerStarted","Data":"1ce204f47b0796a3ad7bbd6d3057ed5ba4b01cef0dbb4c2c09a67125c0a9f31f"}
Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.661672 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"
Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.663901 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" event={"ID":"93238aee-86f0-497a-8880-531338e8245f","Type":"ContainerStarted","Data":"336ae33ea64168293ed52b20fad9e99a463a68f0f5079ea299128fa36e633b69"}
Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.686693 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jr284" podStartSLOduration=2.160956592 podStartE2EDuration="4.68665875s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:08.248904869 +0000 UTC m=+883.121531580" lastFinishedPulling="2026-02-20 07:01:10.774606987 +0000 UTC m=+885.647233738" observedRunningTime="2026-02-20 07:01:11.67618634 +0000 UTC m=+886.548813081" watchObservedRunningTime="2026-02-20 07:01:11.68665875 +0000 UTC m=+886.559285491"
Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.696832 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" podStartSLOduration=2.650744791 podStartE2EDuration="4.696807124s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:08.711335383 +0000 UTC m=+883.583962094" lastFinishedPulling="2026-02-20 07:01:10.757397716 +0000 UTC m=+885.630024427" observedRunningTime="2026-02-20 07:01:11.695209026 +0000 UTC m=+886.567835737" watchObservedRunningTime="2026-02-20 07:01:11.696807124 +0000 UTC m=+886.569433825"
Feb 20 07:01:12 crc kubenswrapper[5094]: I0220 07:01:12.675094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" event={"ID":"edd001fc-3ddc-4010-8a98-54f4ffeaba72","Type":"ContainerStarted","Data":"a83266cf9701e5d698bf6dcbd7f4d1e65dea29c5fc42d18ddab48ff95ae8275c"}
Feb 20 07:01:12 crc kubenswrapper[5094]: I0220 07:01:12.700964 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" podStartSLOduration=2.869391759 podStartE2EDuration="5.700462951s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:09.217297891 +0000 UTC m=+884.089924602" lastFinishedPulling="2026-02-20 07:01:12.048369083 +0000 UTC m=+886.920995794" observedRunningTime="2026-02-20 07:01:12.693213596 +0000 UTC m=+887.565840317" watchObservedRunningTime="2026-02-20 07:01:12.700462951 +0000 UTC m=+887.573089662"
Feb 20 07:01:13 crc kubenswrapper[5094]: I0220 07:01:13.688159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" event={"ID":"93238aee-86f0-497a-8880-531338e8245f","Type":"ContainerStarted","Data":"2edca5c4f7a834ac8a12a5d615a5f1d7706444ebe10163410c24ccf2fa989edf"}
Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.234553 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jr284"
Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.260349 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" podStartSLOduration=6.557505205 podStartE2EDuration="11.260317444s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:08.432419554 +0000 UTC m=+883.305046265" lastFinishedPulling="2026-02-20 07:01:13.135231803 +0000 UTC m=+888.007858504" observedRunningTime="2026-02-20 07:01:13.724993667 +0000 UTC m=+888.597620388" watchObservedRunningTime="2026-02-20 07:01:18.260317444 +0000 UTC m=+893.132944185"
Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.559837 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.559914 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.581611 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.737162 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-576d49774f-hnd7r"
Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.820203 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"]
Feb 20 07:01:28 crc kubenswrapper[5094]: I0220 07:01:28.183648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"
Feb 20 07:01:34 crc kubenswrapper[5094]: I0220 07:01:34.106538 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:01:34 crc kubenswrapper[5094]: I0220 07:01:34.107424 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:01:43 crc kubenswrapper[5094]: I0220 07:01:43.891013 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-shq4j" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" containerID="cri-o://d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a" gracePeriod=15
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.270477 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-shq4j_e130287f-996d-4ab0-8c12-351bf8d21df5/console/0.log"
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.270950 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") "
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382071 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") "
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") "
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382235 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") "
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") "
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382354 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") "
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382374 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") "
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.383318 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.383407 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.383514 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config" (OuterVolumeSpecName: "console-config") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.384073 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca" (OuterVolumeSpecName: "service-ca") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.391855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.392520 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn" (OuterVolumeSpecName: "kube-api-access-x8jnn") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "kube-api-access-x8jnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.392575 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483620 5094 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483657 5094 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483672 5094 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483686 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483697 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") on node \"crc\" DevicePath \"\""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483732 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483745 5094 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954611 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-shq4j_e130287f-996d-4ab0-8c12-351bf8d21df5/console/0.log"
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954682 5094 generic.go:334] "Generic (PLEG): container finished" podID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a" exitCode=2
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954742 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerDied","Data":"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"}
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954787 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerDied","Data":"9e9ebe837e9f43c76e6f912fde9f9a4d76af0096fe554a67909ac3cf138a323a"}
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954809 5094 scope.go:117] "RemoveContainer" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954825 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j"
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.980875 5094 scope.go:117] "RemoveContainer" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"
Feb 20 07:01:44 crc kubenswrapper[5094]: E0220 07:01:44.982242 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a\": container with ID starting with d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a not found: ID does not exist" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.982297 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"} err="failed to get container status \"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a\": rpc error: code = NotFound desc = could not find container \"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a\": container with ID starting with d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a not found: ID does not exist"
Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.997809 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"]
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.004383 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"]
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.559395 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"]
Feb 20 07:01:45 crc kubenswrapper[5094]: E0220 07:01:45.559955 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.560003 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.560261 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.562108 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.565599 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.578357 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"]
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.702641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.702747 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.702837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.804082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.804266 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.804312 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.805327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.805379 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.837826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.869595 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" path="/var/lib/kubelet/pods/e130287f-996d-4ab0-8c12-351bf8d21df5/volumes"
Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.879162 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.399780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"]
Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.980558 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerID="de973fce7e21a1784784f7a7a78d34e6b9fa55160cbf7d0ce0c93ae083f6be75" exitCode=0
Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.980633 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"de973fce7e21a1784784f7a7a78d34e6b9fa55160cbf7d0ce0c93ae083f6be75"}
Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.980678 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerStarted","Data":"6a7724051ddaaa78a6755cc4cbefdaa122af49429cca7f0feaad15fbed9ac2e9"}
Feb 20 07:01:49 crc kubenswrapper[5094]: I0220 07:01:49.002329 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerID="47051f79c300c7ff20786f74ddda68a16f5e36cd8b898ed81cf7f5aceabdc46b" exitCode=0
Feb 20 07:01:49 crc kubenswrapper[5094]: I0220 07:01:49.002496 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"47051f79c300c7ff20786f74ddda68a16f5e36cd8b898ed81cf7f5aceabdc46b"}
Feb 20 07:01:50 crc kubenswrapper[5094]: I0220 07:01:50.013227 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerID="86d72c9f6066ce2d15d3e5f4133f7e9f32b85c81cea4d75c03ff8fef2121e66b" exitCode=0
Feb 20 07:01:50 crc kubenswrapper[5094]: I0220 07:01:50.013286 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"86d72c9f6066ce2d15d3e5f4133f7e9f32b85c81cea4d75c03ff8fef2121e66b"}
Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.383750 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"
Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.517577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"4ffda3d5-82a2-4a0c-9052-2546188c107a\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") "
Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.518091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"4ffda3d5-82a2-4a0c-9052-2546188c107a\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") "
Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.518212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"4ffda3d5-82a2-4a0c-9052-2546188c107a\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") "
Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.520815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle" (OuterVolumeSpecName: "bundle") pod "4ffda3d5-82a2-4a0c-9052-2546188c107a" (UID: "4ffda3d5-82a2-4a0c-9052-2546188c107a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.526068 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2" (OuterVolumeSpecName: "kube-api-access-rhtj2") pod "4ffda3d5-82a2-4a0c-9052-2546188c107a" (UID: "4ffda3d5-82a2-4a0c-9052-2546188c107a"). InnerVolumeSpecName "kube-api-access-rhtj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.537997 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util" (OuterVolumeSpecName: "util") pod "4ffda3d5-82a2-4a0c-9052-2546188c107a" (UID: "4ffda3d5-82a2-4a0c-9052-2546188c107a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.620687 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.620772 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.620788 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:52 crc kubenswrapper[5094]: I0220 07:01:52.033866 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:52 crc kubenswrapper[5094]: I0220 07:01:52.033887 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"6a7724051ddaaa78a6755cc4cbefdaa122af49429cca7f0feaad15fbed9ac2e9"} Feb 20 07:01:52 crc kubenswrapper[5094]: I0220 07:01:52.033953 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7724051ddaaa78a6755cc4cbefdaa122af49429cca7f0feaad15fbed9ac2e9" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.874060 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:01:57 crc kubenswrapper[5094]: E0220 07:01:57.875153 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" 
containerName="util" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875177 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="util" Feb 20 07:01:57 crc kubenswrapper[5094]: E0220 07:01:57.875202 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="pull" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875217 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="pull" Feb 20 07:01:57 crc kubenswrapper[5094]: E0220 07:01:57.875253 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="extract" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875265 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="extract" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875473 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="extract" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.877016 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.884786 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.026042 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.026119 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.026211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127479 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127999 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.128329 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.162048 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.214411 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.549523 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:01:59 crc kubenswrapper[5094]: I0220 07:01:59.084757 5094 generic.go:334] "Generic (PLEG): container finished" podID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" exitCode=0 Feb 20 07:01:59 crc kubenswrapper[5094]: I0220 07:01:59.084865 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc"} Feb 20 07:01:59 crc kubenswrapper[5094]: I0220 07:01:59.085087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerStarted","Data":"a942b404d650c5574e3a557cf6ef05b40134fdcd0171aaf5ba7a37ca35943970"} Feb 20 07:02:00 crc kubenswrapper[5094]: I0220 07:02:00.106450 5094 generic.go:334] "Generic (PLEG): container finished" podID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" exitCode=0 Feb 20 07:02:00 crc kubenswrapper[5094]: I0220 07:02:00.106516 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977"} Feb 20 07:02:01 crc kubenswrapper[5094]: I0220 07:02:01.116876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" 
event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerStarted","Data":"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0"} Feb 20 07:02:01 crc kubenswrapper[5094]: I0220 07:02:01.150470 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-87drp" podStartSLOduration=2.541090217 podStartE2EDuration="4.15044742s" podCreationTimestamp="2026-02-20 07:01:57 +0000 UTC" firstStartedPulling="2026-02-20 07:01:59.087166986 +0000 UTC m=+933.959793697" lastFinishedPulling="2026-02-20 07:02:00.696524179 +0000 UTC m=+935.569150900" observedRunningTime="2026-02-20 07:02:01.145460961 +0000 UTC m=+936.018087682" watchObservedRunningTime="2026-02-20 07:02:01.15044742 +0000 UTC m=+936.023074131" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.106954 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.107561 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.107648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.108817 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.108928 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6" gracePeriod=600 Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156033 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6" exitCode=0 Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6"} Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156803 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d"} Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156834 5094 scope.go:117] "RemoveContainer" containerID="935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.697914 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv"] Feb 20 07:02:06 crc kubenswrapper[5094]: 
I0220 07:02:06.698875 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.701449 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.702295 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.702756 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.702820 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mmdpk" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.703163 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.724474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv"] Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.865503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-apiservice-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.865582 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nrl\" (UniqueName: 
\"kubernetes.io/projected/059e3724-d657-4f2e-beec-f4f55e09e498-kube-api-access-w7nrl\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.865843 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-webhook-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.958992 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf"] Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.959825 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.965455 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.965471 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.965553 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-dvdgj" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.967449 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-apiservice-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.967514 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nrl\" (UniqueName: \"kubernetes.io/projected/059e3724-d657-4f2e-beec-f4f55e09e498-kube-api-access-w7nrl\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.967589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-webhook-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc 
kubenswrapper[5094]: I0220 07:02:06.985260 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-webhook-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.985729 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-apiservice-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.003609 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nrl\" (UniqueName: \"kubernetes.io/projected/059e3724-d657-4f2e-beec-f4f55e09e498-kube-api-access-w7nrl\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.003739 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf"] Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.026224 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.069432 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-webhook-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.069546 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmt9k\" (UniqueName: \"kubernetes.io/projected/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-kube-api-access-hmt9k\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.069616 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-apiservice-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.171801 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-apiservice-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.172291 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-webhook-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.172332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmt9k\" (UniqueName: \"kubernetes.io/projected/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-kube-api-access-hmt9k\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.179594 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-apiservice-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.180630 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-webhook-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.197260 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmt9k\" (UniqueName: \"kubernetes.io/projected/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-kube-api-access-hmt9k\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " 
pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.298039 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv"] Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.336541 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.550779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf"] Feb 20 07:02:07 crc kubenswrapper[5094]: W0220 07:02:07.559064 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dde7604_2a93_4dc0_9b15_b8fe41f79e1e.slice/crio-322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b WatchSource:0}: Error finding container 322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b: Status 404 returned error can't find the container with id 322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.203740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" event={"ID":"059e3724-d657-4f2e-beec-f4f55e09e498","Type":"ContainerStarted","Data":"bf300391a6d05acf75f7ec345df4af7a3803a3f8893829a2dc835ae88ddb3feb"} Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.205990 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" event={"ID":"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e","Type":"ContainerStarted","Data":"322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b"} Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.215800 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.215871 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.273731 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:09 crc kubenswrapper[5094]: I0220 07:02:09.261179 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:11 crc kubenswrapper[5094]: I0220 07:02:11.868186 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:02:11 crc kubenswrapper[5094]: I0220 07:02:11.868853 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-87drp" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" containerID="cri-o://40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" gracePeriod=2 Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.231965 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.260176 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" event={"ID":"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e","Type":"ContainerStarted","Data":"7b078e96d2ded7833eeddb5001d0076622f42a122bc025e2075d3c5da6e5fc85"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.260298 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262624 5094 generic.go:334] "Generic (PLEG): container finished" podID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" exitCode=0 Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262681 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"a942b404d650c5574e3a557cf6ef05b40134fdcd0171aaf5ba7a37ca35943970"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262755 5094 scope.go:117] "RemoveContainer" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262873 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.266012 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" event={"ID":"059e3724-d657-4f2e-beec-f4f55e09e498","Type":"ContainerStarted","Data":"4ca87069728b97927b3f16eaaf6d81d28815ea82f1ca04ff56de39baad6ac450"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.266519 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.278983 5094 scope.go:117] "RemoveContainer" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.282677 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" podStartSLOduration=1.897471326 podStartE2EDuration="6.282664848s" podCreationTimestamp="2026-02-20 07:02:06 +0000 UTC" firstStartedPulling="2026-02-20 07:02:07.563274682 +0000 UTC m=+942.435901403" lastFinishedPulling="2026-02-20 07:02:11.948468214 +0000 UTC m=+946.821094925" observedRunningTime="2026-02-20 07:02:12.280583828 +0000 UTC m=+947.153210539" watchObservedRunningTime="2026-02-20 07:02:12.282664848 +0000 UTC m=+947.155291559" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.301025 5094 scope.go:117] "RemoveContainer" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.310089 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" podStartSLOduration=3.50765722 podStartE2EDuration="6.310068655s" podCreationTimestamp="2026-02-20 07:02:06 +0000 UTC" firstStartedPulling="2026-02-20 
07:02:07.321781289 +0000 UTC m=+942.194408000" lastFinishedPulling="2026-02-20 07:02:10.124192724 +0000 UTC m=+944.996819435" observedRunningTime="2026-02-20 07:02:12.304658145 +0000 UTC m=+947.177284856" watchObservedRunningTime="2026-02-20 07:02:12.310068655 +0000 UTC m=+947.182695366" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.319982 5094 scope.go:117] "RemoveContainer" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" Feb 20 07:02:12 crc kubenswrapper[5094]: E0220 07:02:12.320658 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0\": container with ID starting with 40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0 not found: ID does not exist" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.320724 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0"} err="failed to get container status \"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0\": rpc error: code = NotFound desc = could not find container \"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0\": container with ID starting with 40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0 not found: ID does not exist" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.320755 5094 scope.go:117] "RemoveContainer" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" Feb 20 07:02:12 crc kubenswrapper[5094]: E0220 07:02:12.321175 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977\": container with ID starting with 
1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977 not found: ID does not exist" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.321208 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977"} err="failed to get container status \"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977\": rpc error: code = NotFound desc = could not find container \"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977\": container with ID starting with 1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977 not found: ID does not exist" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.321229 5094 scope.go:117] "RemoveContainer" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" Feb 20 07:02:12 crc kubenswrapper[5094]: E0220 07:02:12.321561 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc\": container with ID starting with ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc not found: ID does not exist" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.321584 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc"} err="failed to get container status \"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc\": rpc error: code = NotFound desc = could not find container \"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc\": container with ID starting with ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc not found: ID does not 
exist" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.353860 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"15dcf959-ddef-4835-a1fc-21247f8d81d4\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.353926 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"15dcf959-ddef-4835-a1fc-21247f8d81d4\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.353961 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"15dcf959-ddef-4835-a1fc-21247f8d81d4\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.356801 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities" (OuterVolumeSpecName: "utilities") pod "15dcf959-ddef-4835-a1fc-21247f8d81d4" (UID: "15dcf959-ddef-4835-a1fc-21247f8d81d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.361524 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf" (OuterVolumeSpecName: "kube-api-access-vj5wf") pod "15dcf959-ddef-4835-a1fc-21247f8d81d4" (UID: "15dcf959-ddef-4835-a1fc-21247f8d81d4"). InnerVolumeSpecName "kube-api-access-vj5wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.412191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15dcf959-ddef-4835-a1fc-21247f8d81d4" (UID: "15dcf959-ddef-4835-a1fc-21247f8d81d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.455516 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.455556 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") on node \"crc\" DevicePath \"\"" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.455570 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.593343 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.599401 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:02:13 crc kubenswrapper[5094]: I0220 07:02:13.846852 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" path="/var/lib/kubelet/pods/15dcf959-ddef-4835-a1fc-21247f8d81d4/volumes" Feb 20 07:02:27 crc kubenswrapper[5094]: I0220 07:02:27.340280 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.030660 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.822684 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4p57m"] Feb 20 07:02:47 crc kubenswrapper[5094]: E0220 07:02:47.823459 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-utilities" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823478 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-utilities" Feb 20 07:02:47 crc kubenswrapper[5094]: E0220 07:02:47.823500 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823511 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" Feb 20 07:02:47 crc kubenswrapper[5094]: E0220 07:02:47.823534 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-content" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823543 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-content" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823677 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.826207 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.832149 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.833101 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.837096 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.837106 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.837358 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-26hgd" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.856154 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.861986 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxmcp\" (UniqueName: \"kubernetes.io/projected/f065adc1-f6c1-4895-a933-906a708555c1-kube-api-access-rxmcp\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914588 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe469d05-edeb-4d23-b06b-6bdbfc646e99-cert\") pod 
\"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914617 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-reloader\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-conf\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-sockets\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-metrics\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh9qq\" (UniqueName: \"kubernetes.io/projected/fe469d05-edeb-4d23-b06b-6bdbfc646e99-kube-api-access-vh9qq\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: 
\"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.915039 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f065adc1-f6c1-4895-a933-906a708555c1-metrics-certs\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.915113 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f065adc1-f6c1-4895-a933-906a708555c1-frr-startup\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.939476 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gjp5f"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.940554 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-gjp5f" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.943369 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.945362 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nrnbq" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.945487 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.945523 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.960635 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-s7ndd"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.961766 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.965264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.995507 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-s7ndd"] Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.016908 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-reloader\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.016971 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t86x\" (UniqueName: \"kubernetes.io/projected/4d145cb8-0c5c-40f7-a99c-15f1575629c3-kube-api-access-8t86x\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.016994 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-conf\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017136 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-metrics-certs\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-sockets\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017313 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-metrics\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017492 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-reloader\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh9qq\" (UniqueName: \"kubernetes.io/projected/fe469d05-edeb-4d23-b06b-6bdbfc646e99-kube-api-access-vh9qq\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017772 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-metrics\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-conf\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017990 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-sockets\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.019868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metallb-excludel2\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020018 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f065adc1-f6c1-4895-a933-906a708555c1-metrics-certs\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f065adc1-f6c1-4895-a933-906a708555c1-frr-startup\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 
crc kubenswrapper[5094]: I0220 07:02:48.020389 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxmcp\" (UniqueName: \"kubernetes.io/projected/f065adc1-f6c1-4895-a933-906a708555c1-kube-api-access-rxmcp\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-cert\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020631 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metrics-certs\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxh8\" (UniqueName: \"kubernetes.io/projected/2a03b7d3-8e22-4a62-98f0-8d72500fab69-kube-api-access-tmxh8\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe469d05-edeb-4d23-b06b-6bdbfc646e99-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 
07:02:48.021173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f065adc1-f6c1-4895-a933-906a708555c1-frr-startup\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.029549 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f065adc1-f6c1-4895-a933-906a708555c1-metrics-certs\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.035286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxmcp\" (UniqueName: \"kubernetes.io/projected/f065adc1-f6c1-4895-a933-906a708555c1-kube-api-access-rxmcp\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.035980 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh9qq\" (UniqueName: \"kubernetes.io/projected/fe469d05-edeb-4d23-b06b-6bdbfc646e99-kube-api-access-vh9qq\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.055830 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe469d05-edeb-4d23-b06b-6bdbfc646e99-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.121913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-cert\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.121961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metrics-certs\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.121987 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxh8\" (UniqueName: \"kubernetes.io/projected/2a03b7d3-8e22-4a62-98f0-8d72500fab69-kube-api-access-tmxh8\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t86x\" (UniqueName: \"kubernetes.io/projected/4d145cb8-0c5c-40f7-a99c-15f1575629c3-kube-api-access-8t86x\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122047 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122067 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-metrics-certs\") pod \"controller-69bbfbf88f-s7ndd\" (UID: 
\"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122091 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metallb-excludel2\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metallb-excludel2\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.122974 5094 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.123029 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist podName:4d145cb8-0c5c-40f7-a99c-15f1575629c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:02:48.623010817 +0000 UTC m=+983.495637528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist") pod "speaker-gjp5f" (UID: "4d145cb8-0c5c-40f7-a99c-15f1575629c3") : secret "metallb-memberlist" not found Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.124869 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.127555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metrics-certs\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.127770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-metrics-certs\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.136385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-cert\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.140468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxh8\" (UniqueName: \"kubernetes.io/projected/2a03b7d3-8e22-4a62-98f0-8d72500fab69-kube-api-access-tmxh8\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.141723 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t86x\" (UniqueName: \"kubernetes.io/projected/4d145cb8-0c5c-40f7-a99c-15f1575629c3-kube-api-access-8t86x\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.149230 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.164496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.280253 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.493397 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-s7ndd"] Feb 20 07:02:48 crc kubenswrapper[5094]: W0220 07:02:48.500209 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a03b7d3_8e22_4a62_98f0_8d72500fab69.slice/crio-194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d WatchSource:0}: Error finding container 194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d: Status 404 returned error can't find the container with id 194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.543903 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"db851ee29f42c86621bfd89bef428bf083277405675601d774f432ee4fd656f8"} Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.545454 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-69bbfbf88f-s7ndd" event={"ID":"2a03b7d3-8e22-4a62-98f0-8d72500fab69","Type":"ContainerStarted","Data":"194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d"} Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.592834 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6"] Feb 20 07:02:48 crc kubenswrapper[5094]: W0220 07:02:48.594528 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe469d05_edeb_4d23_b06b_6bdbfc646e99.slice/crio-f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3 WatchSource:0}: Error finding container f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3: Status 404 returned error can't find the container with id f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3 Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.641038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.641255 5094 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.641317 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist podName:4d145cb8-0c5c-40f7-a99c-15f1575629c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:02:49.641300649 +0000 UTC m=+984.513927360 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist") pod "speaker-gjp5f" (UID: "4d145cb8-0c5c-40f7-a99c-15f1575629c3") : secret "metallb-memberlist" not found Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.555335 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-s7ndd" event={"ID":"2a03b7d3-8e22-4a62-98f0-8d72500fab69","Type":"ContainerStarted","Data":"82d0bf0b027a6c2af72dd4a0802d131e09d95fe1c345e059a39a39206cb92bd8"} Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.555921 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-s7ndd" event={"ID":"2a03b7d3-8e22-4a62-98f0-8d72500fab69","Type":"ContainerStarted","Data":"01b95510a80fea6eeec2b14943604b8e6613411df3e75aa19ad70a120cc86791"} Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.555942 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.558193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" event={"ID":"fe469d05-edeb-4d23-b06b-6bdbfc646e99","Type":"ContainerStarted","Data":"f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3"} Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.585151 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-s7ndd" podStartSLOduration=2.585127843 podStartE2EDuration="2.585127843s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:02:49.578160186 +0000 UTC m=+984.450786897" watchObservedRunningTime="2026-02-20 07:02:49.585127843 +0000 UTC m=+984.457754554" Feb 20 07:02:49 crc 
kubenswrapper[5094]: I0220 07:02:49.668869 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.675556 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.755203 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gjp5f" Feb 20 07:02:49 crc kubenswrapper[5094]: W0220 07:02:49.782193 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d145cb8_0c5c_40f7_a99c_15f1575629c3.slice/crio-4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972 WatchSource:0}: Error finding container 4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972: Status 404 returned error can't find the container with id 4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972 Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.568679 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjp5f" event={"ID":"4d145cb8-0c5c-40f7-a99c-15f1575629c3","Type":"ContainerStarted","Data":"488664b5bc5adbeb38221a500978db23d753c6a8ee17485fbf0e6f5a3dd75897"} Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.569209 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjp5f" 
event={"ID":"4d145cb8-0c5c-40f7-a99c-15f1575629c3","Type":"ContainerStarted","Data":"8be93dc71a92facf56e5abb04779c569f66346b846440a9174bad148ecbc58f6"} Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.569227 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjp5f" event={"ID":"4d145cb8-0c5c-40f7-a99c-15f1575629c3","Type":"ContainerStarted","Data":"4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972"} Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.569458 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gjp5f" Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.594893 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gjp5f" podStartSLOduration=3.594872206 podStartE2EDuration="3.594872206s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:02:50.58582415 +0000 UTC m=+985.458450871" watchObservedRunningTime="2026-02-20 07:02:50.594872206 +0000 UTC m=+985.467498937" Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.621682 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" event={"ID":"fe469d05-edeb-4d23-b06b-6bdbfc646e99","Type":"ContainerStarted","Data":"a4f4e3d94829bcb20bb68586c26fe960c2872a13289025271ed153f7f35abced"} Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.622822 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.627249 5094 generic.go:334] "Generic (PLEG): container finished" podID="f065adc1-f6c1-4895-a933-906a708555c1" containerID="a3af44e0f4e02e8a8325d470de9b0b4bc5dbd143660fd690507fd27f9ba720c9" exitCode=0 Feb 20 07:02:55 crc 
kubenswrapper[5094]: I0220 07:02:55.627293 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerDied","Data":"a3af44e0f4e02e8a8325d470de9b0b4bc5dbd143660fd690507fd27f9ba720c9"} Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.644821 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" podStartSLOduration=1.939134543 podStartE2EDuration="8.644803398s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="2026-02-20 07:02:48.598405782 +0000 UTC m=+983.471032493" lastFinishedPulling="2026-02-20 07:02:55.304074627 +0000 UTC m=+990.176701348" observedRunningTime="2026-02-20 07:02:55.640012313 +0000 UTC m=+990.512639034" watchObservedRunningTime="2026-02-20 07:02:55.644803398 +0000 UTC m=+990.517430109" Feb 20 07:02:56 crc kubenswrapper[5094]: I0220 07:02:56.651817 5094 generic.go:334] "Generic (PLEG): container finished" podID="f065adc1-f6c1-4895-a933-906a708555c1" containerID="3bbff53e5707494cd376f154bb19901ef4f1364b1823ba367ced8c70ab66dde0" exitCode=0 Feb 20 07:02:56 crc kubenswrapper[5094]: I0220 07:02:56.652377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerDied","Data":"3bbff53e5707494cd376f154bb19901ef4f1364b1823ba367ced8c70ab66dde0"} Feb 20 07:02:57 crc kubenswrapper[5094]: I0220 07:02:57.663855 5094 generic.go:334] "Generic (PLEG): container finished" podID="f065adc1-f6c1-4895-a933-906a708555c1" containerID="21c349f70a9e4f427492dfd3250103d18c4b74f84647fb9189af408beec72719" exitCode=0 Feb 20 07:02:57 crc kubenswrapper[5094]: I0220 07:02:57.663945 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" 
event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerDied","Data":"21c349f70a9e4f427492dfd3250103d18c4b74f84647fb9189af408beec72719"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.285405 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"3582500ece9162a401385f04947e726da321096541453bb35d380a32493b15ab"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677307 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"c7ef767e01941b707e320790173c642449e6d1247c24d7d543261cdb6b3774bc"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677318 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"479971f48bcdfa12fa91407bc456683e1f99728d0366fb8fd1a1e825a16c85fe"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677328 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"b2ec4dc1a1aacaf00209de2de8a58cefb5293c65b6ccf1ebab4f97134c587fc2"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677337 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"af1c47ff236eab0ec8b623533b0f5c7ae98507e0c3d152394d037fec855ad13f"} Feb 20 07:02:59 crc kubenswrapper[5094]: I0220 07:02:59.693689 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"cd6b82792d0d03f7b56ed72a30cb17fc94fc0279488e1b7feec76953d082b3d2"} Feb 20 07:02:59 crc kubenswrapper[5094]: I0220 07:02:59.693982 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:59 crc kubenswrapper[5094]: I0220 07:02:59.743576 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4p57m" podStartSLOduration=5.777964179 podStartE2EDuration="12.743538279s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="2026-02-20 07:02:48.307799173 +0000 UTC m=+983.180425904" lastFinishedPulling="2026-02-20 07:02:55.273373293 +0000 UTC m=+990.146000004" observedRunningTime="2026-02-20 07:02:59.737916465 +0000 UTC m=+994.610543166" watchObservedRunningTime="2026-02-20 07:02:59.743538279 +0000 UTC m=+994.616165030" Feb 20 07:03:03 crc kubenswrapper[5094]: I0220 07:03:03.150110 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:03:03 crc kubenswrapper[5094]: I0220 07:03:03.216021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:03:08 crc kubenswrapper[5094]: I0220 07:03:08.155865 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:03:08 crc kubenswrapper[5094]: I0220 07:03:08.169428 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:03:09 crc kubenswrapper[5094]: I0220 07:03:09.760776 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gjp5f" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.315686 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26"] Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.318752 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.320852 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26"] Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.325739 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.367939 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.367995 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.368024 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.469258 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.469363 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.469420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.470576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.470857 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.494631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.647855 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.875746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26"] Feb 20 07:03:11 crc kubenswrapper[5094]: W0220 07:03:11.888032 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67055673_f25d_44d3_99e5_2ac1474b1872.slice/crio-a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2 WatchSource:0}: Error finding container a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2: Status 404 returned error can't find the container with id a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2 Feb 20 07:03:12 crc kubenswrapper[5094]: I0220 07:03:12.817601 5094 generic.go:334] "Generic (PLEG): container finished" podID="67055673-f25d-44d3-99e5-2ac1474b1872" containerID="1da52198b68563bd53f89d4575f494e4f72f2f9554fa289e7566682c09e4ef00" exitCode=0 Feb 20 07:03:12 crc kubenswrapper[5094]: I0220 07:03:12.817760 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"1da52198b68563bd53f89d4575f494e4f72f2f9554fa289e7566682c09e4ef00"} Feb 20 07:03:12 crc kubenswrapper[5094]: I0220 07:03:12.819036 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerStarted","Data":"a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2"} Feb 20 07:03:16 crc kubenswrapper[5094]: I0220 07:03:16.854849 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="67055673-f25d-44d3-99e5-2ac1474b1872" containerID="46af8a535f540b4e725b7d665b3d691bcbfe11d0f6da44e3eeb4d68f82a2f9af" exitCode=0 Feb 20 07:03:16 crc kubenswrapper[5094]: I0220 07:03:16.854913 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"46af8a535f540b4e725b7d665b3d691bcbfe11d0f6da44e3eeb4d68f82a2f9af"} Feb 20 07:03:17 crc kubenswrapper[5094]: I0220 07:03:17.873296 5094 generic.go:334] "Generic (PLEG): container finished" podID="67055673-f25d-44d3-99e5-2ac1474b1872" containerID="cdefbd01dfaf8a33479b0145b8ed11e8938f4b0935863e19a04a08a68cef9f59" exitCode=0 Feb 20 07:03:17 crc kubenswrapper[5094]: I0220 07:03:17.873377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"cdefbd01dfaf8a33479b0145b8ed11e8938f4b0935863e19a04a08a68cef9f59"} Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.328764 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.436138 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod \"67055673-f25d-44d3-99e5-2ac1474b1872\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.436354 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"67055673-f25d-44d3-99e5-2ac1474b1872\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.436439 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"67055673-f25d-44d3-99e5-2ac1474b1872\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.438677 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle" (OuterVolumeSpecName: "bundle") pod "67055673-f25d-44d3-99e5-2ac1474b1872" (UID: "67055673-f25d-44d3-99e5-2ac1474b1872"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.447329 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2" (OuterVolumeSpecName: "kube-api-access-vgqz2") pod "67055673-f25d-44d3-99e5-2ac1474b1872" (UID: "67055673-f25d-44d3-99e5-2ac1474b1872"). InnerVolumeSpecName "kube-api-access-vgqz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.459261 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util" (OuterVolumeSpecName: "util") pod "67055673-f25d-44d3-99e5-2ac1474b1872" (UID: "67055673-f25d-44d3-99e5-2ac1474b1872"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.538804 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.538869 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.538891 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.900396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2"} Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.900880 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.900466 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.406595 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864"] Feb 20 07:03:24 crc kubenswrapper[5094]: E0220 07:03:24.407624 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="pull" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407642 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="pull" Feb 20 07:03:24 crc kubenswrapper[5094]: E0220 07:03:24.407656 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="util" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407662 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="util" Feb 20 07:03:24 crc kubenswrapper[5094]: E0220 07:03:24.407675 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="extract" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407681 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="extract" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407831 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="extract" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.408367 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.419693 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8736e3bf-949d-48fb-a246-83adc37708df-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.419691 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.419964 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b446m\" (UniqueName: \"kubernetes.io/projected/8736e3bf-949d-48fb-a246-83adc37708df-kube-api-access-b446m\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.420407 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.422194 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-gj5vk" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.443290 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864"] Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.521925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/8736e3bf-949d-48fb-a246-83adc37708df-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.522027 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b446m\" (UniqueName: \"kubernetes.io/projected/8736e3bf-949d-48fb-a246-83adc37708df-kube-api-access-b446m\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.522909 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8736e3bf-949d-48fb-a246-83adc37708df-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.556880 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b446m\" (UniqueName: \"kubernetes.io/projected/8736e3bf-949d-48fb-a246-83adc37708df-kube-api-access-b446m\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.734153 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:25 crc kubenswrapper[5094]: I0220 07:03:25.019721 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864"] Feb 20 07:03:25 crc kubenswrapper[5094]: I0220 07:03:25.965019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" event={"ID":"8736e3bf-949d-48fb-a246-83adc37708df","Type":"ContainerStarted","Data":"90fd8fcb765ed206b923255dde2caf620239eb46c6dd821a503c263e73384e7f"} Feb 20 07:03:29 crc kubenswrapper[5094]: I0220 07:03:29.000527 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" event={"ID":"8736e3bf-949d-48fb-a246-83adc37708df","Type":"ContainerStarted","Data":"276d433b83ee5155b9dc708d6dc7de577bd267260a14381bcca1d992a65a461c"} Feb 20 07:03:29 crc kubenswrapper[5094]: I0220 07:03:29.035037 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" podStartSLOduration=1.626340323 podStartE2EDuration="5.035007907s" podCreationTimestamp="2026-02-20 07:03:24 +0000 UTC" firstStartedPulling="2026-02-20 07:03:25.039113489 +0000 UTC m=+1019.911740200" lastFinishedPulling="2026-02-20 07:03:28.447781063 +0000 UTC m=+1023.320407784" observedRunningTime="2026-02-20 07:03:29.027165229 +0000 UTC m=+1023.899791950" watchObservedRunningTime="2026-02-20 07:03:29.035007907 +0000 UTC m=+1023.907634658" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.650158 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-sxrw7"] Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.651883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.654340 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.654765 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.655125 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pp4px" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.666958 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-sxrw7"] Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.771662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795dd\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-kube-api-access-795dd\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.771752 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.873229 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795dd\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-kube-api-access-795dd\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: 
\"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.873326 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.897293 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795dd\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-kube-api-access-795dd\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.901834 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.982327 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:34 crc kubenswrapper[5094]: I0220 07:03:34.494903 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-sxrw7"] Feb 20 07:03:35 crc kubenswrapper[5094]: I0220 07:03:35.045555 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" event={"ID":"bc1f2312-eb97-4f63-b37b-975d9dfb5a73","Type":"ContainerStarted","Data":"a040f4dfe7b135477a3354c3b1349616346ab73cadd25e233d7882bd6138c00f"} Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.449783 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtw89"] Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.454478 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.457488 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zjv8f" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.470500 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtw89"] Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.515723 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.515772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c847p\" (UniqueName: 
\"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-kube-api-access-c847p\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.617034 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.617093 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c847p\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-kube-api-access-c847p\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.643651 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.646555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c847p\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-kube-api-access-c847p\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.780007 5094 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:37 crc kubenswrapper[5094]: I0220 07:03:37.197549 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtw89"] Feb 20 07:03:37 crc kubenswrapper[5094]: W0220 07:03:37.210187 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34f53f0e_6a22_42c9_a953_3ec38e87a70f.slice/crio-6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031 WatchSource:0}: Error finding container 6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031: Status 404 returned error can't find the container with id 6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031 Feb 20 07:03:38 crc kubenswrapper[5094]: I0220 07:03:38.070770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" event={"ID":"34f53f0e-6a22-42c9-a953-3ec38e87a70f","Type":"ContainerStarted","Data":"6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031"} Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.084579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" event={"ID":"bc1f2312-eb97-4f63-b37b-975d9dfb5a73","Type":"ContainerStarted","Data":"f4c1835b58e3139b5342090792e852f6fca1eb07f87f163c9d882a331e7766d5"} Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.085141 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.086173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" 
event={"ID":"34f53f0e-6a22-42c9-a953-3ec38e87a70f","Type":"ContainerStarted","Data":"af7e1f46a75d66bd1c0de4843cf28caaa41cfb30822018e359a0a2fecc214a0b"} Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.134349 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" podStartSLOduration=1.516528272 podStartE2EDuration="4.134311637s" podCreationTimestamp="2026-02-20 07:03:36 +0000 UTC" firstStartedPulling="2026-02-20 07:03:37.213742781 +0000 UTC m=+1032.086369492" lastFinishedPulling="2026-02-20 07:03:39.831526146 +0000 UTC m=+1034.704152857" observedRunningTime="2026-02-20 07:03:40.123405225 +0000 UTC m=+1034.996031976" watchObservedRunningTime="2026-02-20 07:03:40.134311637 +0000 UTC m=+1035.006938368" Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.137050 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" podStartSLOduration=1.7821294060000001 podStartE2EDuration="7.137018522s" podCreationTimestamp="2026-02-20 07:03:33 +0000 UTC" firstStartedPulling="2026-02-20 07:03:34.506144616 +0000 UTC m=+1029.378771337" lastFinishedPulling="2026-02-20 07:03:39.861033742 +0000 UTC m=+1034.733660453" observedRunningTime="2026-02-20 07:03:40.103691383 +0000 UTC m=+1034.976318094" watchObservedRunningTime="2026-02-20 07:03:40.137018522 +0000 UTC m=+1035.009645243" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.406638 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-pdnlx"] Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.408944 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.412162 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-99mmx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.422862 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pdnlx"] Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.564844 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk4d5\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-kube-api-access-gk4d5\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.564961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-bound-sa-token\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.668268 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk4d5\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-kube-api-access-gk4d5\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.668851 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-bound-sa-token\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: 
\"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.703952 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk4d5\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-kube-api-access-gk4d5\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.706571 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-bound-sa-token\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.740328 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:44 crc kubenswrapper[5094]: I0220 07:03:44.323628 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pdnlx"] Feb 20 07:03:44 crc kubenswrapper[5094]: W0220 07:03:44.343297 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6360113_cdd8_48a4_a145_4b54eb5510eb.slice/crio-ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52 WatchSource:0}: Error finding container ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52: Status 404 returned error can't find the container with id ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52 Feb 20 07:03:45 crc kubenswrapper[5094]: I0220 07:03:45.127831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pdnlx" 
event={"ID":"d6360113-cdd8-48a4-a145-4b54eb5510eb","Type":"ContainerStarted","Data":"b79a77daa180badf772abd691f2f5997fd80d06cc147083eb970116ad6b44067"} Feb 20 07:03:45 crc kubenswrapper[5094]: I0220 07:03:45.128163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pdnlx" event={"ID":"d6360113-cdd8-48a4-a145-4b54eb5510eb","Type":"ContainerStarted","Data":"ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52"} Feb 20 07:03:45 crc kubenswrapper[5094]: I0220 07:03:45.151693 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-pdnlx" podStartSLOduration=2.151668518 podStartE2EDuration="2.151668518s" podCreationTimestamp="2026-02-20 07:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:03:45.150356427 +0000 UTC m=+1040.022983138" watchObservedRunningTime="2026-02-20 07:03:45.151668518 +0000 UTC m=+1040.024295229" Feb 20 07:03:48 crc kubenswrapper[5094]: I0220 07:03:48.986075 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.531286 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.535780 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.543413 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.545009 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-m77f6" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.545047 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.611073 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.638910 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"openstack-operator-index-nmt6q\" (UID: \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.740637 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"openstack-operator-index-nmt6q\" (UID: \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.764636 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"openstack-operator-index-nmt6q\" (UID: 
\"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.862431 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:53 crc kubenswrapper[5094]: I0220 07:03:53.366548 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:53 crc kubenswrapper[5094]: W0220 07:03:53.375573 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ef4848_2558_4cf8_bac6_2f5ed78a74af.slice/crio-815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44 WatchSource:0}: Error finding container 815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44: Status 404 returned error can't find the container with id 815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44 Feb 20 07:03:54 crc kubenswrapper[5094]: I0220 07:03:54.202053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerStarted","Data":"815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44"} Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.214102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerStarted","Data":"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251"} Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.242263 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nmt6q" podStartSLOduration=2.546658751 podStartE2EDuration="3.242234559s" podCreationTimestamp="2026-02-20 07:03:52 +0000 UTC" 
firstStartedPulling="2026-02-20 07:03:53.379659142 +0000 UTC m=+1048.252285893" lastFinishedPulling="2026-02-20 07:03:54.07523498 +0000 UTC m=+1048.947861701" observedRunningTime="2026-02-20 07:03:55.237880124 +0000 UTC m=+1050.110506875" watchObservedRunningTime="2026-02-20 07:03:55.242234559 +0000 UTC m=+1050.114861280" Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.297945 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.922012 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-72vrj"] Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.925272 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.944350 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72vrj"] Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.002219 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpnf\" (UniqueName: \"kubernetes.io/projected/be2cc842-778e-4963-80f8-bb5c7426f175-kube-api-access-4cpnf\") pod \"openstack-operator-index-72vrj\" (UID: \"be2cc842-778e-4963-80f8-bb5c7426f175\") " pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.103579 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cpnf\" (UniqueName: \"kubernetes.io/projected/be2cc842-778e-4963-80f8-bb5c7426f175-kube-api-access-4cpnf\") pod \"openstack-operator-index-72vrj\" (UID: \"be2cc842-778e-4963-80f8-bb5c7426f175\") " pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.138687 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4cpnf\" (UniqueName: \"kubernetes.io/projected/be2cc842-778e-4963-80f8-bb5c7426f175-kube-api-access-4cpnf\") pod \"openstack-operator-index-72vrj\" (UID: \"be2cc842-778e-4963-80f8-bb5c7426f175\") " pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.267244 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.848490 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72vrj"] Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.228842 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72vrj" event={"ID":"be2cc842-778e-4963-80f8-bb5c7426f175","Type":"ContainerStarted","Data":"f5c4c71d04ff2231e43246d1822cb9a245ea195f3ded88f709d65978fdc2022b"} Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.228966 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nmt6q" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server" containerID="cri-o://ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" gracePeriod=2 Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.653823 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.739616 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\" (UID: \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.749564 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm" (OuterVolumeSpecName: "kube-api-access-fzqsm") pod "84ef4848-2558-4cf8-bac6-2f5ed78a74af" (UID: "84ef4848-2558-4cf8-bac6-2f5ed78a74af"). InnerVolumeSpecName "kube-api-access-fzqsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.841514 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.245853 5094 generic.go:334] "Generic (PLEG): container finished" podID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" exitCode=0 Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.245941 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.245958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerDied","Data":"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251"} Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.246699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerDied","Data":"815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44"} Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.246762 5094 scope.go:117] "RemoveContainer" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.248632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72vrj" event={"ID":"be2cc842-778e-4963-80f8-bb5c7426f175","Type":"ContainerStarted","Data":"3bfa36b9236a3b7f7fea4cd454e8e0192b64db37a8cd11d33836efaef3df6216"} Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.274856 5094 scope.go:117] "RemoveContainer" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" Feb 20 07:03:58 crc kubenswrapper[5094]: E0220 07:03:58.275528 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251\": container with ID starting with ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251 not found: ID does not exist" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.275576 5094 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251"} err="failed to get container status \"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251\": rpc error: code = NotFound desc = could not find container \"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251\": container with ID starting with ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251 not found: ID does not exist" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.275522 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-72vrj" podStartSLOduration=2.861411185 podStartE2EDuration="3.275497483s" podCreationTimestamp="2026-02-20 07:03:55 +0000 UTC" firstStartedPulling="2026-02-20 07:03:56.866028307 +0000 UTC m=+1051.738655038" lastFinishedPulling="2026-02-20 07:03:57.280114585 +0000 UTC m=+1052.152741336" observedRunningTime="2026-02-20 07:03:58.271090267 +0000 UTC m=+1053.143716978" watchObservedRunningTime="2026-02-20 07:03:58.275497483 +0000 UTC m=+1053.148124194" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.290096 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.295831 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:59 crc kubenswrapper[5094]: I0220 07:03:59.866898 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" path="/var/lib/kubelet/pods/84ef4848-2558-4cf8-bac6-2f5ed78a74af/volumes" Feb 20 07:04:04 crc kubenswrapper[5094]: I0220 07:04:04.108321 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:04:04 crc kubenswrapper[5094]: I0220 07:04:04.109132 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:04:06 crc kubenswrapper[5094]: I0220 07:04:06.268246 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-72vrj"
Feb 20 07:04:06 crc kubenswrapper[5094]: I0220 07:04:06.268761 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-72vrj"
Feb 20 07:04:06 crc kubenswrapper[5094]: I0220 07:04:06.324898 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-72vrj"
Feb 20 07:04:07 crc kubenswrapper[5094]: I0220 07:04:07.376779 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-72vrj"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.182406 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"]
Feb 20 07:04:08 crc kubenswrapper[5094]: E0220 07:04:08.183341 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.183365 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.183592 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.185381 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.188531 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djg5h"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.211623 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"]
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.376251 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.376407 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.376554 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.479028 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.479147 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.479297 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.480193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.480334 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.527105 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.545988 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:09 crc kubenswrapper[5094]: I0220 07:04:09.113094 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"]
Feb 20 07:04:09 crc kubenswrapper[5094]: I0220 07:04:09.353840 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerStarted","Data":"1dbf16311d7a0ac4186a4a3270744782a0fa09c450c7a74b1f6b9e7716c6c9fa"}
Feb 20 07:04:09 crc kubenswrapper[5094]: I0220 07:04:09.354276 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerStarted","Data":"5bb6e1a7942b18183fdd9a00f5713cbb7a86224eb665594b5faadb12cd12d191"}
Feb 20 07:04:10 crc kubenswrapper[5094]: I0220 07:04:10.368369 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerID="1dbf16311d7a0ac4186a4a3270744782a0fa09c450c7a74b1f6b9e7716c6c9fa" exitCode=0
Feb 20 07:04:10 crc kubenswrapper[5094]: I0220 07:04:10.368514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"1dbf16311d7a0ac4186a4a3270744782a0fa09c450c7a74b1f6b9e7716c6c9fa"}
Feb 20 07:04:11 crc kubenswrapper[5094]: I0220 07:04:11.380765 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerID="f70005987e48bddefbb3bf68ca3f16bff1c9e7d0a808b97397a6f4f05c1592a2" exitCode=0
Feb 20 07:04:11 crc kubenswrapper[5094]: I0220 07:04:11.380895 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"f70005987e48bddefbb3bf68ca3f16bff1c9e7d0a808b97397a6f4f05c1592a2"}
Feb 20 07:04:12 crc kubenswrapper[5094]: I0220 07:04:12.397199 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerID="6daec5c360406e0c342eeb7a99717114607f5233e49dd5dcd337f4f5cf56d753" exitCode=0
Feb 20 07:04:12 crc kubenswrapper[5094]: I0220 07:04:12.397279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"6daec5c360406e0c342eeb7a99717114607f5233e49dd5dcd337f4f5cf56d753"}
Feb 20 07:04:13 crc kubenswrapper[5094]: I0220 07:04:13.807988 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.002987 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") "
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.003128 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") "
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.003249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") "
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.004662 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle" (OuterVolumeSpecName: "bundle") pod "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" (UID: "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.013611 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr" (OuterVolumeSpecName: "kube-api-access-9swrr") pod "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" (UID: "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8"). InnerVolumeSpecName "kube-api-access-9swrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.034617 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util" (OuterVolumeSpecName: "util") pod "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" (UID: "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.105376 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") on node \"crc\" DevicePath \"\""
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.105421 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.105435 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") on node \"crc\" DevicePath \"\""
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.417496 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.417485 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"5bb6e1a7942b18183fdd9a00f5713cbb7a86224eb665594b5faadb12cd12d191"}
Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.417664 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bb6e1a7942b18183fdd9a00f5713cbb7a86224eb665594b5faadb12cd12d191"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632185 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"]
Feb 20 07:04:19 crc kubenswrapper[5094]: E0220 07:04:19.632935 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="extract"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632950 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="extract"
Feb 20 07:04:19 crc kubenswrapper[5094]: E0220 07:04:19.632963 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="util"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632970 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="util"
Feb 20 07:04:19 crc kubenswrapper[5094]: E0220 07:04:19.632987 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="pull"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632992 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="pull"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.633125 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="extract"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.633624 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.637062 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-7wwrf"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.675914 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"]
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.809937 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t982r\" (UniqueName: \"kubernetes.io/projected/234632e4-6191-4ec8-94c5-c93d71c13ad0-kube-api-access-t982r\") pod \"openstack-operator-controller-init-6679bf9b57-9glnw\" (UID: \"234632e4-6191-4ec8-94c5-c93d71c13ad0\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.911516 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t982r\" (UniqueName: \"kubernetes.io/projected/234632e4-6191-4ec8-94c5-c93d71c13ad0-kube-api-access-t982r\") pod \"openstack-operator-controller-init-6679bf9b57-9glnw\" (UID: \"234632e4-6191-4ec8-94c5-c93d71c13ad0\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.931780 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t982r\" (UniqueName: \"kubernetes.io/projected/234632e4-6191-4ec8-94c5-c93d71c13ad0-kube-api-access-t982r\") pod \"openstack-operator-controller-init-6679bf9b57-9glnw\" (UID: \"234632e4-6191-4ec8-94c5-c93d71c13ad0\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"
Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.957409 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"
Feb 20 07:04:20 crc kubenswrapper[5094]: I0220 07:04:20.207764 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"]
Feb 20 07:04:20 crc kubenswrapper[5094]: I0220 07:04:20.463882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" event={"ID":"234632e4-6191-4ec8-94c5-c93d71c13ad0","Type":"ContainerStarted","Data":"d5053b001dd9f5b362508d148cc8d4e057cd56f7263bb415d58b44bd56fc15c0"}
Feb 20 07:04:26 crc kubenswrapper[5094]: I0220 07:04:26.520477 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" event={"ID":"234632e4-6191-4ec8-94c5-c93d71c13ad0","Type":"ContainerStarted","Data":"d5069cfe9313c7bb2e8bde9749c8e36f2b3253e24164df4741ab0743381ff918"}
Feb 20 07:04:26 crc kubenswrapper[5094]: I0220 07:04:26.521363 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"
Feb 20 07:04:26 crc kubenswrapper[5094]: I0220 07:04:26.570217 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" podStartSLOduration=2.347067089 podStartE2EDuration="7.570105066s" podCreationTimestamp="2026-02-20 07:04:19 +0000 UTC" firstStartedPulling="2026-02-20 07:04:20.218815898 +0000 UTC m=+1075.091442609" lastFinishedPulling="2026-02-20 07:04:25.441853875 +0000 UTC m=+1080.314480586" observedRunningTime="2026-02-20 07:04:26.560185529 +0000 UTC m=+1081.432812240" watchObservedRunningTime="2026-02-20 07:04:26.570105066 +0000 UTC m=+1081.442731857"
Feb 20 07:04:34 crc kubenswrapper[5094]: I0220 07:04:34.107494 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:04:34 crc kubenswrapper[5094]: I0220 07:04:34.108383 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:04:39 crc kubenswrapper[5094]: I0220 07:04:39.961487 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.148619 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.150457 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.154961 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.155580 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zj5lz"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.156003 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.157754 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5qvsj"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.171521 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.178618 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.189499 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.190334 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.192752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2nzgh"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.202785 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.203778 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.207194 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2nt96"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.212772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.227779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234133 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4v8q\" (UniqueName: \"kubernetes.io/projected/6b09cc76-8cba-42ed-bb2c-fdf4473c9afe-kube-api-access-x4v8q\") pod \"barbican-operator-controller-manager-868647ff47-k5dkn\" (UID: \"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234214 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw7ck\" (UniqueName: \"kubernetes.io/projected/a91a9b82-fc6b-4900-becb-6dc3c100e429-kube-api-access-kw7ck\") pod \"designate-operator-controller-manager-6d8bf5c495-xjng5\" (UID: \"a91a9b82-fc6b-4900-becb-6dc3c100e429\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234237 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhkb\" (UniqueName: \"kubernetes.io/projected/f6c8e20e-ecca-42d4-9e0e-5547ae567d9f-kube-api-access-5xhkb\") pod \"glance-operator-controller-manager-77987464f4-26vtn\" (UID: \"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234261 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnh9x\" (UniqueName: \"kubernetes.io/projected/32338b54-c33f-4dc5-b328-9cf4d92d1db6-kube-api-access-hnh9x\") pod \"cinder-operator-controller-manager-5d946d989d-24cv7\" (UID: \"32338b54-c33f-4dc5-b328-9cf4d92d1db6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.250755 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.251758 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.253563 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-47cqd"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.264044 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.295320 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.296388 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.300123 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9l9lb"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.309944 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.310968 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.316016 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.316785 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-twhdx"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338857 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw9kw\" (UniqueName: \"kubernetes.io/projected/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-kube-api-access-bw9kw\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsk2\" (UniqueName: \"kubernetes.io/projected/36d60210-52d5-4f28-ae0b-28cce632d5cb-kube-api-access-xbsk2\") pod \"heat-operator-controller-manager-69f49c598c-p689m\" (UID: \"36d60210-52d5-4f28-ae0b-28cce632d5cb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338929 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr97x\" (UniqueName: \"kubernetes.io/projected/292cb132-b03c-4d20-8bee-c90ad3c4486b-kube-api-access-rr97x\") pod \"horizon-operator-controller-manager-5b9b8895d5-mnd7v\" (UID: \"292cb132-b03c-4d20-8bee-c90ad3c4486b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338946 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4v8q\" (UniqueName: \"kubernetes.io/projected/6b09cc76-8cba-42ed-bb2c-fdf4473c9afe-kube-api-access-x4v8q\") pod \"barbican-operator-controller-manager-868647ff47-k5dkn\" (UID: \"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.339035 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw7ck\" (UniqueName: \"kubernetes.io/projected/a91a9b82-fc6b-4900-becb-6dc3c100e429-kube-api-access-kw7ck\") pod \"designate-operator-controller-manager-6d8bf5c495-xjng5\" (UID: \"a91a9b82-fc6b-4900-becb-6dc3c100e429\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.339056 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhkb\" (UniqueName: \"kubernetes.io/projected/f6c8e20e-ecca-42d4-9e0e-5547ae567d9f-kube-api-access-5xhkb\") pod \"glance-operator-controller-manager-77987464f4-26vtn\" (UID: \"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.339078 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnh9x\" (UniqueName: \"kubernetes.io/projected/32338b54-c33f-4dc5-b328-9cf4d92d1db6-kube-api-access-hnh9x\") pod \"cinder-operator-controller-manager-5d946d989d-24cv7\" (UID: \"32338b54-c33f-4dc5-b328-9cf4d92d1db6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.359068 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.405490 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.412833 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnh9x\" (UniqueName: \"kubernetes.io/projected/32338b54-c33f-4dc5-b328-9cf4d92d1db6-kube-api-access-hnh9x\") pod \"cinder-operator-controller-manager-5d946d989d-24cv7\" (UID: \"32338b54-c33f-4dc5-b328-9cf4d92d1db6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.417025 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhkb\" (UniqueName: \"kubernetes.io/projected/f6c8e20e-ecca-42d4-9e0e-5547ae567d9f-kube-api-access-5xhkb\") pod \"glance-operator-controller-manager-77987464f4-26vtn\" (UID: \"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.422118 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.423912 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.428294 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lghsf"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.429418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw7ck\" (UniqueName: \"kubernetes.io/projected/a91a9b82-fc6b-4900-becb-6dc3c100e429-kube-api-access-kw7ck\") pod \"designate-operator-controller-manager-6d8bf5c495-xjng5\" (UID: \"a91a9b82-fc6b-4900-becb-6dc3c100e429\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.455164 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4v8q\" (UniqueName: \"kubernetes.io/projected/6b09cc76-8cba-42ed-bb2c-fdf4473c9afe-kube-api-access-x4v8q\") pod \"barbican-operator-controller-manager-868647ff47-k5dkn\" (UID: \"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.456850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cw9\" (UniqueName: \"kubernetes.io/projected/fcf15128-56ef-42dc-b230-1cd8b7638d33-kube-api-access-25cw9\") pod \"ironic-operator-controller-manager-554564d7fc-8j8pv\" (UID: \"fcf15128-56ef-42dc-b230-1cd8b7638d33\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.456927 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457183 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw9kw\" (UniqueName: \"kubernetes.io/projected/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-kube-api-access-bw9kw\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457225 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsk2\" (UniqueName: \"kubernetes.io/projected/36d60210-52d5-4f28-ae0b-28cce632d5cb-kube-api-access-xbsk2\") pod \"heat-operator-controller-manager-69f49c598c-p689m\" (UID: \"36d60210-52d5-4f28-ae0b-28cce632d5cb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457491 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457539 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr97x\" (UniqueName: \"kubernetes.io/projected/292cb132-b03c-4d20-8bee-c90ad3c4486b-kube-api-access-rr97x\") pod \"horizon-operator-controller-manager-5b9b8895d5-mnd7v\" (UID: \"292cb132-b03c-4d20-8bee-c90ad3c4486b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"
Feb 20 07:05:00 crc kubenswrapper[5094]: E0220 07:05:00.457780 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:00 crc kubenswrapper[5094]: E0220 07:05:00.457856 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:00.95783095 +0000 UTC m=+1115.830457661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.474666 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.475379 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.477832 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mc85t"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.488274 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.508382 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"]
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.514612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr97x\" (UniqueName: \"kubernetes.io/projected/292cb132-b03c-4d20-8bee-c90ad3c4486b-kube-api-access-rr97x\") pod \"horizon-operator-controller-manager-5b9b8895d5-mnd7v\" (UID: \"292cb132-b03c-4d20-8bee-c90ad3c4486b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.515215 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsk2\" (UniqueName: \"kubernetes.io/projected/36d60210-52d5-4f28-ae0b-28cce632d5cb-kube-api-access-xbsk2\") pod \"heat-operator-controller-manager-69f49c598c-p689m\" (UID: \"36d60210-52d5-4f28-ae0b-28cce632d5cb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.521769 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"
Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.528564 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.529413 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.535507 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw9kw\" (UniqueName: \"kubernetes.io/projected/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-kube-api-access-bw9kw\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.573837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzj5\" (UniqueName: \"kubernetes.io/projected/8a1c02cd-3546-45fa-b7db-5903c80681a4-kube-api-access-tgzj5\") pod \"keystone-operator-controller-manager-b4d948c87-86bxl\" (UID: \"8a1c02cd-3546-45fa-b7db-5903c80681a4\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.574057 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.580115 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cw9\" (UniqueName: \"kubernetes.io/projected/fcf15128-56ef-42dc-b230-1cd8b7638d33-kube-api-access-25cw9\") pod \"ironic-operator-controller-manager-554564d7fc-8j8pv\" (UID: \"fcf15128-56ef-42dc-b230-1cd8b7638d33\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.589192 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.590336 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.615613 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-swrx9" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.616062 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.632230 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cw9\" (UniqueName: \"kubernetes.io/projected/fcf15128-56ef-42dc-b230-1cd8b7638d33-kube-api-access-25cw9\") pod \"ironic-operator-controller-manager-554564d7fc-8j8pv\" (UID: \"fcf15128-56ef-42dc-b230-1cd8b7638d33\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.632307 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.633271 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.641740 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wmcg8" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.681315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cc8w\" (UniqueName: \"kubernetes.io/projected/b863d4f9-063a-4102-8c3d-f7e092e4e2c0-kube-api-access-7cc8w\") pod \"manila-operator-controller-manager-54f6768c69-hfkff\" (UID: \"b863d4f9-063a-4102-8c3d-f7e092e4e2c0\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.681391 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzj5\" (UniqueName: \"kubernetes.io/projected/8a1c02cd-3546-45fa-b7db-5903c80681a4-kube-api-access-tgzj5\") pod \"keystone-operator-controller-manager-b4d948c87-86bxl\" (UID: 
\"8a1c02cd-3546-45fa-b7db-5903c80681a4\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.681428 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsjg7\" (UniqueName: \"kubernetes.io/projected/6177108e-bc02-497c-80ab-312f61fbd1c2-kube-api-access-hsjg7\") pod \"mariadb-operator-controller-manager-6994f66f48-n5dgn\" (UID: \"6177108e-bc02-497c-80ab-312f61fbd1c2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.701807 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.707227 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzj5\" (UniqueName: \"kubernetes.io/projected/8a1c02cd-3546-45fa-b7db-5903c80681a4-kube-api-access-tgzj5\") pod \"keystone-operator-controller-manager-b4d948c87-86bxl\" (UID: \"8a1c02cd-3546-45fa-b7db-5903c80681a4\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.743994 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.785820 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsjg7\" (UniqueName: \"kubernetes.io/projected/6177108e-bc02-497c-80ab-312f61fbd1c2-kube-api-access-hsjg7\") pod \"mariadb-operator-controller-manager-6994f66f48-n5dgn\" (UID: \"6177108e-bc02-497c-80ab-312f61fbd1c2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.785928 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cc8w\" (UniqueName: \"kubernetes.io/projected/b863d4f9-063a-4102-8c3d-f7e092e4e2c0-kube-api-access-7cc8w\") pod \"manila-operator-controller-manager-54f6768c69-hfkff\" (UID: \"b863d4f9-063a-4102-8c3d-f7e092e4e2c0\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.786776 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.787849 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.788494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.789165 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.824075 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.824585 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8zgn5" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.825128 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7bhrk" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.861770 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.886982 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rc9s\" (UniqueName: \"kubernetes.io/projected/93dbc041-00c2-4189-abca-6bb3a00abc2d-kube-api-access-6rc9s\") pod \"nova-operator-controller-manager-567668f5cf-2ftdz\" (UID: \"93dbc041-00c2-4189-abca-6bb3a00abc2d\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.887549 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhj9q\" (UniqueName: \"kubernetes.io/projected/eba1b8e0-b529-47ad-a657-75ce01bad56a-kube-api-access-bhj9q\") pod \"neutron-operator-controller-manager-64ddbf8bb-bd9tr\" (UID: \"eba1b8e0-b529-47ad-a657-75ce01bad56a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.888656 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hsjg7\" (UniqueName: \"kubernetes.io/projected/6177108e-bc02-497c-80ab-312f61fbd1c2-kube-api-access-hsjg7\") pod \"mariadb-operator-controller-manager-6994f66f48-n5dgn\" (UID: \"6177108e-bc02-497c-80ab-312f61fbd1c2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.893690 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cc8w\" (UniqueName: \"kubernetes.io/projected/b863d4f9-063a-4102-8c3d-f7e092e4e2c0-kube-api-access-7cc8w\") pod \"manila-operator-controller-manager-54f6768c69-hfkff\" (UID: \"b863d4f9-063a-4102-8c3d-f7e092e4e2c0\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.915412 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.919868 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.922426 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.925320 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-psclz" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.936791 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.960970 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.962098 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.962152 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.966055 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kkvr8" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.987000 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.989643 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhj9q\" (UniqueName: \"kubernetes.io/projected/eba1b8e0-b529-47ad-a657-75ce01bad56a-kube-api-access-bhj9q\") pod \"neutron-operator-controller-manager-64ddbf8bb-bd9tr\" (UID: \"eba1b8e0-b529-47ad-a657-75ce01bad56a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rc9s\" (UniqueName: \"kubernetes.io/projected/93dbc041-00c2-4189-abca-6bb3a00abc2d-kube-api-access-6rc9s\") pod \"nova-operator-controller-manager-567668f5cf-2ftdz\" (UID: \"93dbc041-00c2-4189-abca-6bb3a00abc2d\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999579 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999730 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9bv\" (UniqueName: \"kubernetes.io/projected/683351ac-f508-4961-b07a-eaac9c26a4f3-kube-api-access-pk9bv\") pod \"octavia-operator-controller-manager-69f8888797-ngkcq\" (UID: \"683351ac-f508-4961-b07a-eaac9c26a4f3\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.000229 5094 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.005834 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:02.005805184 +0000 UTC m=+1116.878431895 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.018785 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.019866 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.019893 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.022416 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.023192 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j64q2" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.026740 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rc9s\" (UniqueName: \"kubernetes.io/projected/93dbc041-00c2-4189-abca-6bb3a00abc2d-kube-api-access-6rc9s\") pod \"nova-operator-controller-manager-567668f5cf-2ftdz\" (UID: \"93dbc041-00c2-4189-abca-6bb3a00abc2d\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.028183 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhj9q\" (UniqueName: \"kubernetes.io/projected/eba1b8e0-b529-47ad-a657-75ce01bad56a-kube-api-access-bhj9q\") pod \"neutron-operator-controller-manager-64ddbf8bb-bd9tr\" (UID: \"eba1b8e0-b529-47ad-a657-75ce01bad56a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.037895 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.045955 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.052808 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-kkw69" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.063049 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.064121 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.068400 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kxv5p" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.073246 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.086234 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.091146 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.092394 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.100348 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-f6gzp" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk9bv\" (UniqueName: \"kubernetes.io/projected/683351ac-f508-4961-b07a-eaac9c26a4f3-kube-api-access-pk9bv\") pod \"octavia-operator-controller-manager-69f8888797-ngkcq\" (UID: \"683351ac-f508-4961-b07a-eaac9c26a4f3\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108421 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108457 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg9gs\" (UniqueName: \"kubernetes.io/projected/74861845-de37-4091-9226-bcb1bbe64b35-kube-api-access-pg9gs\") pod \"ovn-operator-controller-manager-d44cf6b75-ct2h7\" (UID: \"74861845-de37-4091-9226-bcb1bbe64b35\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv4q2\" (UniqueName: \"kubernetes.io/projected/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-kube-api-access-fv4q2\") 
pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rws2w\" (UniqueName: \"kubernetes.io/projected/c510ecc1-53ce-4611-af6a-09488f9317ed-kube-api-access-rws2w\") pod \"placement-operator-controller-manager-8497b45c89-9c2k5\" (UID: \"c510ecc1-53ce-4611-af6a-09488f9317ed\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.130878 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9bv\" (UniqueName: \"kubernetes.io/projected/683351ac-f508-4961-b07a-eaac9c26a4f3-kube-api-access-pk9bv\") pod \"octavia-operator-controller-manager-69f8888797-ngkcq\" (UID: \"683351ac-f508-4961-b07a-eaac9c26a4f3\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.131449 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.139881 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.199940 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-nll74"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.206836 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.213029 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8hrbp" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222380 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222569 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg9gs\" (UniqueName: \"kubernetes.io/projected/74861845-de37-4091-9226-bcb1bbe64b35-kube-api-access-pg9gs\") pod \"ovn-operator-controller-manager-d44cf6b75-ct2h7\" (UID: \"74861845-de37-4091-9226-bcb1bbe64b35\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv4q2\" (UniqueName: \"kubernetes.io/projected/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-kube-api-access-fv4q2\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zldt2\" (UniqueName: \"kubernetes.io/projected/a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5-kube-api-access-zldt2\") pod 
\"swift-operator-controller-manager-68f46476f-dh48q\" (UID: \"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222869 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptz6g\" (UniqueName: \"kubernetes.io/projected/c45dcc1f-a95d-4492-9139-16d550809a8e-kube-api-access-ptz6g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-cbtkd\" (UID: \"c45dcc1f-a95d-4492-9139-16d550809a8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222903 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rws2w\" (UniqueName: \"kubernetes.io/projected/c510ecc1-53ce-4611-af6a-09488f9317ed-kube-api-access-rws2w\") pod \"placement-operator-controller-manager-8497b45c89-9c2k5\" (UID: \"c510ecc1-53ce-4611-af6a-09488f9317ed\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.222932 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.223021 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:01.722994596 +0000 UTC m=+1116.595621307 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.233354 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-nll74"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.246001 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv4q2\" (UniqueName: \"kubernetes.io/projected/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-kube-api-access-fv4q2\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.246747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rws2w\" (UniqueName: \"kubernetes.io/projected/c510ecc1-53ce-4611-af6a-09488f9317ed-kube-api-access-rws2w\") pod \"placement-operator-controller-manager-8497b45c89-9c2k5\" (UID: \"c510ecc1-53ce-4611-af6a-09488f9317ed\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.247826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg9gs\" (UniqueName: \"kubernetes.io/projected/74861845-de37-4091-9226-bcb1bbe64b35-kube-api-access-pg9gs\") pod \"ovn-operator-controller-manager-d44cf6b75-ct2h7\" (UID: \"74861845-de37-4091-9226-bcb1bbe64b35\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.256492 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.258831 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.261232 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.265886 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ms752" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.271864 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.279551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.301426 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.301453 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.303162 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.307220 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.307503 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-prrqj" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.307675 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.309779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.321234 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.321881 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325458 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ckl9\" (UniqueName: \"kubernetes.io/projected/4413dc36-58b0-447a-ba69-cdd2cee9589c-kube-api-access-2ckl9\") pod \"watcher-operator-controller-manager-5db88f68c-lz57q\" (UID: \"4413dc36-58b0-447a-ba69-cdd2cee9589c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zldt2\" (UniqueName: 
\"kubernetes.io/projected/a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5-kube-api-access-zldt2\") pod \"swift-operator-controller-manager-68f46476f-dh48q\" (UID: \"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49rgc\" (UniqueName: \"kubernetes.io/projected/fce6c9b3-2075-479d-9a16-738831a871c4-kube-api-access-49rgc\") pod \"test-operator-controller-manager-7866795846-nll74\" (UID: \"fce6c9b3-2075-479d-9a16-738831a871c4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptz6g\" (UniqueName: \"kubernetes.io/projected/c45dcc1f-a95d-4492-9139-16d550809a8e-kube-api-access-ptz6g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-cbtkd\" (UID: \"c45dcc1f-a95d-4492-9139-16d550809a8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.330628 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.345037 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.351317 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7kj87" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.357470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zldt2\" (UniqueName: \"kubernetes.io/projected/a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5-kube-api-access-zldt2\") pod \"swift-operator-controller-manager-68f46476f-dh48q\" (UID: \"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.363624 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.369925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptz6g\" (UniqueName: \"kubernetes.io/projected/c45dcc1f-a95d-4492-9139-16d550809a8e-kube-api-access-ptz6g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-cbtkd\" (UID: \"c45dcc1f-a95d-4492-9139-16d550809a8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.382096 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.391887 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.416685 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.425887 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428327 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428697 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxl72\" (UniqueName: \"kubernetes.io/projected/f45a4211-8890-4e4a-af96-ccffec62160c-kube-api-access-fxl72\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fq9n6\" (UID: \"f45a4211-8890-4e4a-af96-ccffec62160c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod 
\"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428845 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78z96\" (UniqueName: \"kubernetes.io/projected/a1b74404-906b-4466-a3bd-289458ef90ea-kube-api-access-78z96\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428944 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ckl9\" (UniqueName: \"kubernetes.io/projected/4413dc36-58b0-447a-ba69-cdd2cee9589c-kube-api-access-2ckl9\") pod \"watcher-operator-controller-manager-5db88f68c-lz57q\" (UID: \"4413dc36-58b0-447a-ba69-cdd2cee9589c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49rgc\" (UniqueName: \"kubernetes.io/projected/fce6c9b3-2075-479d-9a16-738831a871c4-kube-api-access-49rgc\") pod \"test-operator-controller-manager-7866795846-nll74\" (UID: \"fce6c9b3-2075-479d-9a16-738831a871c4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.470375 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49rgc\" (UniqueName: \"kubernetes.io/projected/fce6c9b3-2075-479d-9a16-738831a871c4-kube-api-access-49rgc\") pod \"test-operator-controller-manager-7866795846-nll74\" (UID: \"fce6c9b3-2075-479d-9a16-738831a871c4\") " 
pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.478925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ckl9\" (UniqueName: \"kubernetes.io/projected/4413dc36-58b0-447a-ba69-cdd2cee9589c-kube-api-access-2ckl9\") pod \"watcher-operator-controller-manager-5db88f68c-lz57q\" (UID: \"4413dc36-58b0-447a-ba69-cdd2cee9589c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.513357 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.530815 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78z96\" (UniqueName: \"kubernetes.io/projected/a1b74404-906b-4466-a3bd-289458ef90ea-kube-api-access-78z96\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.530934 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.530974 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxl72\" (UniqueName: \"kubernetes.io/projected/f45a4211-8890-4e4a-af96-ccffec62160c-kube-api-access-fxl72\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fq9n6\" (UID: 
\"f45a4211-8890-4e4a-af96-ccffec62160c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.531003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531143 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531212 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:02.031194117 +0000 UTC m=+1116.903820828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531401 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531469 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:02.031448223 +0000 UTC m=+1116.904074934 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.546099 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.550577 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxl72\" (UniqueName: \"kubernetes.io/projected/f45a4211-8890-4e4a-af96-ccffec62160c-kube-api-access-fxl72\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fq9n6\" (UID: \"f45a4211-8890-4e4a-af96-ccffec62160c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.552622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78z96\" (UniqueName: \"kubernetes.io/projected/a1b74404-906b-4466-a3bd-289458ef90ea-kube-api-access-78z96\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:01 crc kubenswrapper[5094]: W0220 07:05:01.594592 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32338b54_c33f_4dc5_b328_9cf4d92d1db6.slice/crio-ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1 WatchSource:0}: Error finding container ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1: Status 404 returned error can't find the container with id ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1 Feb 20 07:05:01 crc kubenswrapper[5094]: 
I0220 07:05:01.599595 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.672753 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.674631 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"] Feb 20 07:05:01 crc kubenswrapper[5094]: W0220 07:05:01.689293 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c8e20e_ecca_42d4_9e0e_5547ae567d9f.slice/crio-e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b WatchSource:0}: Error finding container e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b: Status 404 returned error can't find the container with id e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.694791 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"] Feb 20 07:05:01 crc kubenswrapper[5094]: W0220 07:05:01.703200 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod292cb132_b03c_4d20_8bee_c90ad3c4486b.slice/crio-a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76 WatchSource:0}: Error finding container a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76: Status 404 returned error can't find the container with id a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76 Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.735726 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.736031 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.736098 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:02.736077194 +0000 UTC m=+1117.608703905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.852680 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" event={"ID":"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f","Type":"ContainerStarted","Data":"e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b"} Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.853888 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" event={"ID":"292cb132-b03c-4d20-8bee-c90ad3c4486b","Type":"ContainerStarted","Data":"a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76"} Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.854573 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" event={"ID":"32338b54-c33f-4dc5-b328-9cf4d92d1db6","Type":"ContainerStarted","Data":"ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1"} Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.856217 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" event={"ID":"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe","Type":"ContainerStarted","Data":"2546da07e9828a80836e443d394ea45756a6da690df77f82e48dffa15b8a38fb"} Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.989680 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.013842 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.030430 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.040925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.040984 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: 
\"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.041052 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041134 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041191 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041236 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:04.041209211 +0000 UTC m=+1118.913835922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041276 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:03.041246382 +0000 UTC m=+1117.913873093 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041326 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041419 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:03.041393056 +0000 UTC m=+1117.914019957 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.041967 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf15128_56ef_42dc_b230_1cd8b7638d33.slice/crio-c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9 WatchSource:0}: Error finding container c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9: Status 404 returned error can't find the container with id c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9 Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.042214 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1c02cd_3546_45fa_b7db_5903c80681a4.slice/crio-94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8 
WatchSource:0}: Error finding container 94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8: Status 404 returned error can't find the container with id 94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8 Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.052175 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"] Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.055662 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6177108e_bc02_497c_80ab_312f61fbd1c2.slice/crio-e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177 WatchSource:0}: Error finding container e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177: Status 404 returned error can't find the container with id e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177 Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.072606 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.078949 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.083248 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.087065 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.110086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 
07:05:02.219026 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.231428 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"] Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.239317 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0d3c29b_4f57_4647_b1a5_bfd6c887b0b5.slice/crio-32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4 WatchSource:0}: Error finding container 32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4: Status 404 returned error can't find the container with id 32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4 Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.240579 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"] Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.240774 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pg9gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-ct2h7_openstack-operators(74861845-de37-4091-9226-bcb1bbe64b35): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.241945 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podUID="74861845-de37-4091-9226-bcb1bbe64b35" Feb 20 07:05:02 crc 
kubenswrapper[5094]: E0220 07:05:02.242572 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2ckl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-lz57q_openstack-operators(4413dc36-58b0-447a-ba69-cdd2cee9589c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.242803 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zldt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-dh48q_openstack-operators(a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.243848 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podUID="a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5" Feb 20 07:05:02 crc 
kubenswrapper[5094]: E0220 07:05:02.243888 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podUID="4413dc36-58b0-447a-ba69-cdd2cee9589c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.246267 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.385596 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-nll74"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.393363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.398036 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"] Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.406123 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-49rgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-nll74_openstack-operators(fce6c9b3-2075-479d-9a16-738831a871c4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.407279 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podUID="fce6c9b3-2075-479d-9a16-738831a871c4" Feb 20 
07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.433597 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fxl72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod rabbitmq-cluster-operator-manager-668c99d594-fq9n6_openstack-operators(f45a4211-8890-4e4a-af96-ccffec62160c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.435495 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podUID="f45a4211-8890-4e4a-af96-ccffec62160c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.753792 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.753999 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.754055 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:04.754039843 +0000 UTC m=+1119.626666554 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.874164 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" event={"ID":"8a1c02cd-3546-45fa-b7db-5903c80681a4","Type":"ContainerStarted","Data":"94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.876660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" event={"ID":"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5","Type":"ContainerStarted","Data":"32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.879764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podUID="a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.883630 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" event={"ID":"74861845-de37-4091-9226-bcb1bbe64b35","Type":"ContainerStarted","Data":"1226a272d4576b1c0f5b17f0133b715d6616e04e7a6af25043e09c3a5535b3fa"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.885132 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podUID="74861845-de37-4091-9226-bcb1bbe64b35" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.888206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" event={"ID":"fcf15128-56ef-42dc-b230-1cd8b7638d33","Type":"ContainerStarted","Data":"c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.901074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" event={"ID":"f45a4211-8890-4e4a-af96-ccffec62160c","Type":"ContainerStarted","Data":"977319233c272af4c112ca04768045fd464b4196ee3e93e05889715106d9f8d8"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.903496 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podUID="f45a4211-8890-4e4a-af96-ccffec62160c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.905431 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" event={"ID":"6177108e-bc02-497c-80ab-312f61fbd1c2","Type":"ContainerStarted","Data":"e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.912865 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" event={"ID":"eba1b8e0-b529-47ad-a657-75ce01bad56a","Type":"ContainerStarted","Data":"f1543b3c2008b61d7a8c5098b0f18e5e652ea6483cc681db043f746c50f6e9e5"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.933846 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" event={"ID":"36d60210-52d5-4f28-ae0b-28cce632d5cb","Type":"ContainerStarted","Data":"ff6c2fb5b9eae6a01844bd0c8a44ddaaf03e5f4cc39c426c041a41154f90d4bd"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.935661 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" event={"ID":"c45dcc1f-a95d-4492-9139-16d550809a8e","Type":"ContainerStarted","Data":"dbd2eb3298c5c979ede34dd908ae0c73cf9b7f02ff7d949ed98293cd33f68023"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.937208 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" event={"ID":"4413dc36-58b0-447a-ba69-cdd2cee9589c","Type":"ContainerStarted","Data":"8fcedb8f6c462ae8e7b5be314a0fd227e808595a19ba58a9a54306c677684035"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.939982 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podUID="4413dc36-58b0-447a-ba69-cdd2cee9589c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.940887 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" 
event={"ID":"fce6c9b3-2075-479d-9a16-738831a871c4","Type":"ContainerStarted","Data":"cd111fcc519bebf20aba9638139d03fe4c850a8a3a4db049045f83d9be5cd50a"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.946235 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podUID="fce6c9b3-2075-479d-9a16-738831a871c4" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.947467 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" event={"ID":"a91a9b82-fc6b-4900-becb-6dc3c100e429","Type":"ContainerStarted","Data":"945797c64dae2ce83c388ea43f46c0236748a7466110c6458b5981969d8e34ed"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.952610 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" event={"ID":"683351ac-f508-4961-b07a-eaac9c26a4f3","Type":"ContainerStarted","Data":"b110eae661d3734a215563cf14698a21d98de1f29eae56bda957525836393a10"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.954108 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" event={"ID":"93dbc041-00c2-4189-abca-6bb3a00abc2d","Type":"ContainerStarted","Data":"cf42301ed1a82263f83e2b5d3c2a0a708ac6d8e891bc7952422ccfc2e50fc430"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.973834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" 
event={"ID":"c510ecc1-53ce-4611-af6a-09488f9317ed","Type":"ContainerStarted","Data":"8f0fbeddce022b4d7e052550a161d07abfb4ca488977490ac278930c29fa26d0"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.980773 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" event={"ID":"b863d4f9-063a-4102-8c3d-f7e092e4e2c0","Type":"ContainerStarted","Data":"b006f500e4680d825e2f399ca940bd6b93e1075ebea8faf060f34ba1f608f4f6"} Feb 20 07:05:03 crc kubenswrapper[5094]: I0220 07:05:03.058793 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:03 crc kubenswrapper[5094]: I0220 07:05:03.058889 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.059434 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.059506 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:05.059483508 +0000 UTC m=+1119.932110219 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.060874 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.060914 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:05.060903142 +0000 UTC m=+1119.933529853 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041250 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podUID="fce6c9b3-2075-479d-9a16-738831a871c4" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041597 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podUID="a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041351 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podUID="f45a4211-8890-4e4a-af96-ccffec62160c" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041423 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podUID="74861845-de37-4091-9226-bcb1bbe64b35" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041538 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podUID="4413dc36-58b0-447a-ba69-cdd2cee9589c" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.082375 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:04 crc 
kubenswrapper[5094]: E0220 07:05:04.083864 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.083953 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:08.083931473 +0000 UTC m=+1122.956558184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.107534 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.107624 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.107688 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.112125 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.112216 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d" gracePeriod=600 Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.802537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.802898 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.803723 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:08.80367465 +0000 UTC m=+1123.676301361 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070319 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d" exitCode=0
Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d"}
Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070420 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7"}
Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070441 5094 scope.go:117] "RemoveContainer" containerID="8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6"
Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.123374 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.123681 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.123905 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.124061 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:09.124036923 +0000 UTC m=+1123.996663624 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found
Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.124056 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.124173 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:09.124146995 +0000 UTC m=+1123.996773706 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found
Feb 20 07:05:08 crc kubenswrapper[5094]: I0220 07:05:08.185338 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.185800 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.185971 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:16.185950293 +0000 UTC m=+1131.058577004 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:08 crc kubenswrapper[5094]: I0220 07:05:08.898383 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"
Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.899176 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.899283 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:16.899257986 +0000 UTC m=+1131.771884697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 20 07:05:09 crc kubenswrapper[5094]: I0220 07:05:09.208463 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.208722 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 20 07:05:09 crc kubenswrapper[5094]: I0220 07:05:09.208740 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.208835 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:17.20880525 +0000 UTC m=+1132.081431981 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found
Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.208899 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.209051 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:17.209030105 +0000 UTC m=+1132.081656816 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found
Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.195440 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df"
Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.196234 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5xhkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987464f4-26vtn_openstack-operators(f6c8e20e-ecca-42d4-9e0e-5547ae567d9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.197447 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" podUID="f6c8e20e-ecca-42d4-9e0e-5547ae567d9f"
Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.891129 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c"
Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.891424 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7cc8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-hfkff_openstack-operators(b863d4f9-063a-4102-8c3d-f7e092e4e2c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.892829 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" podUID="b863d4f9-063a-4102-8c3d-f7e092e4e2c0"
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.161910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" podUID="f6c8e20e-ecca-42d4-9e0e-5547ae567d9f"
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.167163 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" podUID="b863d4f9-063a-4102-8c3d-f7e092e4e2c0"
Feb 20 07:05:16 crc kubenswrapper[5094]: I0220 07:05:16.270649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.270876 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.270952 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:32.270927081 +0000 UTC m=+1147.143553782 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.665991 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf"
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.666460 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhj9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-bd9tr_openstack-operators(eba1b8e0-b529-47ad-a657-75ce01bad56a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.668838 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" podUID="eba1b8e0-b529-47ad-a657-75ce01bad56a"
Feb 20 07:05:16 crc kubenswrapper[5094]: I0220 07:05:16.985960 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.986166 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.986744 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:32.986720134 +0000 UTC m=+1147.859346845 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.179590 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" podUID="eba1b8e0-b529-47ad-a657-75ce01bad56a"
Feb 20 07:05:17 crc kubenswrapper[5094]: I0220 07:05:17.295729 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:17 crc kubenswrapper[5094]: I0220 07:05:17.295817 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296024 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296056 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296093 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:33.296075492 +0000 UTC m=+1148.168702203 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296186 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:33.296158354 +0000 UTC m=+1148.168785155 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.410268 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838"
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.410502 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6rc9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-2ftdz_openstack-operators(93dbc041-00c2-4189-abca-6bb3a00abc2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.411722 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" podUID="93dbc041-00c2-4189-abca-6bb3a00abc2d"
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.982039 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1"
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.982403 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tgzj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-86bxl_openstack-operators(8a1c02cd-3546-45fa-b7db-5903c80681a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.983664 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" podUID="8a1c02cd-3546-45fa-b7db-5903c80681a4"
Feb 20 07:05:18 crc kubenswrapper[5094]: E0220 07:05:18.185682 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" podUID="8a1c02cd-3546-45fa-b7db-5903c80681a4"
Feb 20 07:05:18 crc kubenswrapper[5094]: E0220 07:05:18.186053 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" podUID="93dbc041-00c2-4189-abca-6bb3a00abc2d"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.203022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" event={"ID":"683351ac-f508-4961-b07a-eaac9c26a4f3","Type":"ContainerStarted","Data":"636f0837c2e6575d9e75e03bc4f8506710b5cc62fa99829f7eecb7b7ec543aaa"}
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.203553 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.205932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" event={"ID":"fcf15128-56ef-42dc-b230-1cd8b7638d33","Type":"ContainerStarted","Data":"473fcdb9f8b94742ba0beb96434c6166f8469323ad6c9b75af3c10d454df84ef"}
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.206082 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.208659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" event={"ID":"6177108e-bc02-497c-80ab-312f61fbd1c2","Type":"ContainerStarted","Data":"a16e4827a448d0324601fa9c38fa56da5f23ba1117c3be7efb0bc1bcb40517c2"}
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.209025 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.210179 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" event={"ID":"c510ecc1-53ce-4611-af6a-09488f9317ed","Type":"ContainerStarted","Data":"e8a259b0c995ebbe163e1105c929a394dabd010c2fd03ca3ac6aa334dd10430c"}
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.210368 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.229883 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" podStartSLOduration=3.140056058 podStartE2EDuration="20.229847614s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.108538584 +0000 UTC m=+1116.981165295" lastFinishedPulling="2026-02-20 07:05:19.19833014 +0000 UTC m=+1134.070956851" observedRunningTime="2026-02-20 07:05:20.217861757 +0000 UTC m=+1135.090488468" watchObservedRunningTime="2026-02-20 07:05:20.229847614 +0000 UTC m=+1135.102474325"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.260177 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" podStartSLOduration=3.124349291 podStartE2EDuration="20.26015113s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.062178583 +0000 UTC m=+1116.934805294" lastFinishedPulling="2026-02-20 07:05:19.197980422 +0000 UTC m=+1134.070607133" observedRunningTime="2026-02-20 07:05:20.23886663 +0000 UTC m=+1135.111493341" watchObservedRunningTime="2026-02-20 07:05:20.26015113 +0000 UTC m=+1135.132777841"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.260887 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" podStartSLOduration=4.554018202 podStartE2EDuration="20.260879888s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.240085414 +0000 UTC m=+1117.112712125" lastFinishedPulling="2026-02-20 07:05:17.94694708 +0000 UTC m=+1132.819573811" observedRunningTime="2026-02-20 07:05:20.256648076 +0000 UTC m=+1135.129274787" watchObservedRunningTime="2026-02-20 07:05:20.260879888 +0000 UTC m=+1135.133506599"
Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.282350 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" podStartSLOduration=3.132777944 podStartE2EDuration="20.282328721s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.051024626 +0000 UTC m=+1116.923651337" lastFinishedPulling="2026-02-20 07:05:19.200575403 +0000 UTC m=+1134.073202114" observedRunningTime="2026-02-20 07:05:20.280075647 +0000 UTC m=+1135.152702358" watchObservedRunningTime="2026-02-20 07:05:20.282328721 +0000 UTC m=+1135.154955432"
Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.281556 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" event={"ID":"a91a9b82-fc6b-4900-becb-6dc3c100e429","Type":"ContainerStarted","Data":"b03564b29955f9e95c748c28ab6fecc3eeb609037719f6f48604e68481b2f640"}
Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.282784 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"
Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.287763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" event={"ID":"292cb132-b03c-4d20-8bee-c90ad3c4486b","Type":"ContainerStarted","Data":"8b4ed3acff59323b9b5362b8c86698480082e25a2f5d7ba8ba718f3c183b82f1"}
Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.288843 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"
Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.307798 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" podStartSLOduration=11.377370575 podStartE2EDuration="27.307772375s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.015474185 +0000 UTC m=+1116.888100886" lastFinishedPulling="2026-02-20 07:05:17.945875975 +0000 UTC m=+1132.818502686" observedRunningTime="2026-02-20 07:05:27.300340866 +0000 UTC m=+1142.172967587" watchObservedRunningTime="2026-02-20 07:05:27.307772375 +0000 UTC m=+1142.180399106"
Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.325981 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" podStartSLOduration=11.087351749 podStartE2EDuration="27.32594919s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.707255163 +0000 UTC m=+1116.579881874" lastFinishedPulling="2026-02-20 07:05:17.945852614 +0000 UTC m=+1132.818479315" observedRunningTime="2026-02-20 07:05:27.320533331 +0000 UTC m=+1142.193160042" watchObservedRunningTime="2026-02-20 07:05:27.32594919 +0000 UTC m=+1142.198575921"
Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.329970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"
event={"ID":"c45dcc1f-a95d-4492-9139-16d550809a8e","Type":"ContainerStarted","Data":"1cb81276893e4adf3bf82e13819f46c33ccc912a39584e0cad9bf24855f4bb33"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.331801 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.335912 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" event={"ID":"fce6c9b3-2075-479d-9a16-738831a871c4","Type":"ContainerStarted","Data":"3cc28b52343224699ce5bec9612f82ea520b5f3fb0d60b0eb025cfedec2b6aae"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.336366 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.337925 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" event={"ID":"36d60210-52d5-4f28-ae0b-28cce632d5cb","Type":"ContainerStarted","Data":"3c0c2ea8b0cbf1c81bd5a6ba841d7dea29d115d4686f0e0a075a6cf71285e8b2"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.338052 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.339522 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" event={"ID":"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe","Type":"ContainerStarted","Data":"d8c9ae7389671c7b3dd5af71666815e4bcf18f376d81ba1876d1b4989a6699f3"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.339643 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.362541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" event={"ID":"32338b54-c33f-4dc5-b328-9cf4d92d1db6","Type":"ContainerStarted","Data":"3b8131ad97940a5d11910fe6adada88383a8d3997e80bed72efd5db8a3b2230c"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.362590 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.393431 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" podStartSLOduration=11.810348723 podStartE2EDuration="29.393391593s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.61736919 +0000 UTC m=+1116.489995901" lastFinishedPulling="2026-02-20 07:05:19.20041206 +0000 UTC m=+1134.073038771" observedRunningTime="2026-02-20 07:05:29.385637188 +0000 UTC m=+1144.258263899" watchObservedRunningTime="2026-02-20 07:05:29.393391593 +0000 UTC m=+1144.266018304" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.395458 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" podStartSLOduration=12.62146673 podStartE2EDuration="29.395450443s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.424290536 +0000 UTC m=+1117.296917267" lastFinishedPulling="2026-02-20 07:05:19.198274269 +0000 UTC m=+1134.070900980" observedRunningTime="2026-02-20 07:05:29.370554327 +0000 UTC m=+1144.243181038" watchObservedRunningTime="2026-02-20 07:05:29.395450443 +0000 UTC m=+1144.268077154" Feb 20 07:05:29 crc 
kubenswrapper[5094]: I0220 07:05:29.412097 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" podStartSLOduration=11.577333024 podStartE2EDuration="29.412073531s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.363335547 +0000 UTC m=+1116.235962268" lastFinishedPulling="2026-02-20 07:05:19.198076064 +0000 UTC m=+1134.070702775" observedRunningTime="2026-02-20 07:05:29.409862198 +0000 UTC m=+1144.282488909" watchObservedRunningTime="2026-02-20 07:05:29.412073531 +0000 UTC m=+1144.284700242" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.438400 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" podStartSLOduration=12.270509115 podStartE2EDuration="29.438382211s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.032461072 +0000 UTC m=+1116.905087783" lastFinishedPulling="2026-02-20 07:05:19.200334168 +0000 UTC m=+1134.072960879" observedRunningTime="2026-02-20 07:05:29.437229224 +0000 UTC m=+1144.309855935" watchObservedRunningTime="2026-02-20 07:05:29.438382211 +0000 UTC m=+1144.311008922" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.469170 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podStartSLOduration=12.610971068 podStartE2EDuration="29.469145228s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.405884884 +0000 UTC m=+1117.278511595" lastFinishedPulling="2026-02-20 07:05:19.264059044 +0000 UTC m=+1134.136685755" observedRunningTime="2026-02-20 07:05:29.461648748 +0000 UTC m=+1144.334275459" watchObservedRunningTime="2026-02-20 07:05:29.469145228 +0000 UTC m=+1144.341771939" Feb 20 07:05:30 crc 
kubenswrapper[5094]: I0220 07:05:30.371019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" event={"ID":"f45a4211-8890-4e4a-af96-ccffec62160c","Type":"ContainerStarted","Data":"d6b14c87e61a5ce0c28866ab7791e9f9c288b6ac76ac375e138e71d8ed8c709a"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.373011 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" event={"ID":"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f","Type":"ContainerStarted","Data":"86ae7dab61271ba568191fd4743d167f4d87e005eddbbba8b77fee05909c1284"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.373314 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.374940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" event={"ID":"8a1c02cd-3546-45fa-b7db-5903c80681a4","Type":"ContainerStarted","Data":"3684e8b81fc66683808c13368b66854e3de68171ce23e01070b8e0970743c3c4"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.375324 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.379234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" event={"ID":"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5","Type":"ContainerStarted","Data":"0c90819418a8e3dd75259c7f23eb53e15d434c602fe6a4469131d205261dd6d7"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.379416 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" 
Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.381022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" event={"ID":"b863d4f9-063a-4102-8c3d-f7e092e4e2c0","Type":"ContainerStarted","Data":"baaccfe1c5f13a3900e7cc76f1d91037031f5b1f68927234f39ff96baa8f2cd2"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.381297 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.382621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" event={"ID":"74861845-de37-4091-9226-bcb1bbe64b35","Type":"ContainerStarted","Data":"1c7b8de53e9ca51fc133d387dcd05968671c66159d44551b9a5ca284e5643cdf"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.382987 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.385084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" event={"ID":"93dbc041-00c2-4189-abca-6bb3a00abc2d","Type":"ContainerStarted","Data":"2e8c12073614c7188ef7925e53ed028ed710e9a944b9c21866b720f228e64a3f"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.385477 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.391743 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" 
event={"ID":"4413dc36-58b0-447a-ba69-cdd2cee9589c","Type":"ContainerStarted","Data":"831a296804a99995767f55cd675f9382bdb0141f31d26430c827697dd599456a"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.392099 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.397783 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podStartSLOduration=2.595486685 podStartE2EDuration="29.397771868s" podCreationTimestamp="2026-02-20 07:05:01 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.433370493 +0000 UTC m=+1117.305997214" lastFinishedPulling="2026-02-20 07:05:29.235655676 +0000 UTC m=+1144.108282397" observedRunningTime="2026-02-20 07:05:30.393793253 +0000 UTC m=+1145.266419964" watchObservedRunningTime="2026-02-20 07:05:30.397771868 +0000 UTC m=+1145.270398579" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.423595 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" podStartSLOduration=2.71215947 podStartE2EDuration="30.423576326s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.692877249 +0000 UTC m=+1116.565503960" lastFinishedPulling="2026-02-20 07:05:29.404294105 +0000 UTC m=+1144.276920816" observedRunningTime="2026-02-20 07:05:30.415909923 +0000 UTC m=+1145.288536634" watchObservedRunningTime="2026-02-20 07:05:30.423576326 +0000 UTC m=+1145.296203037" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.441148 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podStartSLOduration=3.559958525 podStartE2EDuration="30.441134597s" podCreationTimestamp="2026-02-20 
07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.24241084 +0000 UTC m=+1117.115037551" lastFinishedPulling="2026-02-20 07:05:29.123586902 +0000 UTC m=+1143.996213623" observedRunningTime="2026-02-20 07:05:30.436870034 +0000 UTC m=+1145.309496745" watchObservedRunningTime="2026-02-20 07:05:30.441134597 +0000 UTC m=+1145.313761298" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.465367 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" podStartSLOduration=3.112861476 podStartE2EDuration="30.465348706s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.140732595 +0000 UTC m=+1117.013359306" lastFinishedPulling="2026-02-20 07:05:29.493219825 +0000 UTC m=+1144.365846536" observedRunningTime="2026-02-20 07:05:30.460281895 +0000 UTC m=+1145.332908606" watchObservedRunningTime="2026-02-20 07:05:30.465348706 +0000 UTC m=+1145.337975417" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.486093 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podStartSLOduration=3.6046175639999998 podStartE2EDuration="30.486075853s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.240556465 +0000 UTC m=+1117.113183176" lastFinishedPulling="2026-02-20 07:05:29.122014744 +0000 UTC m=+1143.994641465" observedRunningTime="2026-02-20 07:05:30.482416025 +0000 UTC m=+1145.355042736" watchObservedRunningTime="2026-02-20 07:05:30.486075853 +0000 UTC m=+1145.358702564" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.512737 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podStartSLOduration=13.442435192 podStartE2EDuration="30.512721721s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" 
firstStartedPulling="2026-02-20 07:05:02.242660836 +0000 UTC m=+1117.115287547" lastFinishedPulling="2026-02-20 07:05:19.312947365 +0000 UTC m=+1134.185574076" observedRunningTime="2026-02-20 07:05:30.50812772 +0000 UTC m=+1145.380754431" watchObservedRunningTime="2026-02-20 07:05:30.512721721 +0000 UTC m=+1145.385348432" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.534253 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" podStartSLOduration=3.168427447 podStartE2EDuration="30.534233666s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.051019546 +0000 UTC m=+1116.923646247" lastFinishedPulling="2026-02-20 07:05:29.416825765 +0000 UTC m=+1144.289452466" observedRunningTime="2026-02-20 07:05:30.527583396 +0000 UTC m=+1145.400210107" watchObservedRunningTime="2026-02-20 07:05:30.534233666 +0000 UTC m=+1145.406860367" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.575218 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" podStartSLOduration=3.481030053 podStartE2EDuration="30.575206127s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.138689926 +0000 UTC m=+1117.011316637" lastFinishedPulling="2026-02-20 07:05:29.232866 +0000 UTC m=+1144.105492711" observedRunningTime="2026-02-20 07:05:30.573417744 +0000 UTC m=+1145.446044455" watchObservedRunningTime="2026-02-20 07:05:30.575206127 +0000 UTC m=+1145.447832838" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.919281 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:31 crc kubenswrapper[5094]: I0220 07:05:31.027497 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:31 crc kubenswrapper[5094]: I0220 07:05:31.305692 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:31 crc kubenswrapper[5094]: I0220 07:05:31.391889 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.347063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.363942 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.439824 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-twhdx" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.441929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" event={"ID":"eba1b8e0-b529-47ad-a657-75ce01bad56a","Type":"ContainerStarted","Data":"dc9033c6796474544bbfb7242972c4be3d2a328bec53f134465730ab0b3828d4"} Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.442316 5094 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.445016 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.482275 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" podStartSLOduration=3.321730228 podStartE2EDuration="32.482250589s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.111111535 +0000 UTC m=+1116.983738246" lastFinishedPulling="2026-02-20 07:05:31.271631896 +0000 UTC m=+1146.144258607" observedRunningTime="2026-02-20 07:05:32.479585775 +0000 UTC m=+1147.352212526" watchObservedRunningTime="2026-02-20 07:05:32.482250589 +0000 UTC m=+1147.354877300" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.952763 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"] Feb 20 07:05:32 crc kubenswrapper[5094]: W0220 07:05:32.962143 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb67a9bc_35a6_4ce3_bca8_a08ee824cda7.slice/crio-da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a WatchSource:0}: Error finding container da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a: Status 404 returned error can't find the container with id da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.062770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod 
\"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.070564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.143687 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j64q2" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.150978 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.367517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.368059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 
07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.374734 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.375390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.457501 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-prrqj" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.457886 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" event={"ID":"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7","Type":"ContainerStarted","Data":"da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a"} Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.464582 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.639082 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"] Feb 20 07:05:33 crc kubenswrapper[5094]: W0220 07:05:33.651914 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b4cb2c_e7bc_4430_bfb8_3642dab61d84.slice/crio-d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9 WatchSource:0}: Error finding container d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9: Status 404 returned error can't find the container with id d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9 Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.970951 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"] Feb 20 07:05:33 crc kubenswrapper[5094]: W0220 07:05:33.973473 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b74404_906b_4466_a3bd_289458ef90ea.slice/crio-26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe WatchSource:0}: Error finding container 26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe: Status 404 returned error can't find the container with id 26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.469188 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" event={"ID":"a1b74404-906b-4466-a3bd-289458ef90ea","Type":"ContainerStarted","Data":"92972e17b7597d7d246ec89d508ca3dc503e4c27b0890de6d8d8f76defbcf1dd"} Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 
07:05:34.469261 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" event={"ID":"a1b74404-906b-4466-a3bd-289458ef90ea","Type":"ContainerStarted","Data":"26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe"}
Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.469401 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.472869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" event={"ID":"57b4cb2c-e7bc-4430-bfb8-3642dab61d84","Type":"ContainerStarted","Data":"d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9"}
Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.503330 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" podStartSLOduration=34.503304972 podStartE2EDuration="34.503304972s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:05:34.496800876 +0000 UTC m=+1149.369427597" watchObservedRunningTime="2026-02-20 07:05:34.503304972 +0000 UTC m=+1149.375931693"
Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.490762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" event={"ID":"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7","Type":"ContainerStarted","Data":"f8dbef55b69f2b26129eb4aff60bb4f83ca3198a9863d5f2edc242f42986c98a"}
Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.491236 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.493802 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" event={"ID":"57b4cb2c-e7bc-4430-bfb8-3642dab61d84","Type":"ContainerStarted","Data":"9b0b47dfc0d39af6f26080c06a44ddae0ffc8f865a0a51586ed513c1d57933a7"}
Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.493902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"
Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.517006 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" podStartSLOduration=33.755968694 podStartE2EDuration="36.516982898s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:32.965450931 +0000 UTC m=+1147.838077682" lastFinishedPulling="2026-02-20 07:05:35.726465165 +0000 UTC m=+1150.599091886" observedRunningTime="2026-02-20 07:05:36.514099829 +0000 UTC m=+1151.386726560" watchObservedRunningTime="2026-02-20 07:05:36.516982898 +0000 UTC m=+1151.389609609"
Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.560933 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" podStartSLOduration=34.500328411 podStartE2EDuration="36.56089751s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:33.661869031 +0000 UTC m=+1148.534495752" lastFinishedPulling="2026-02-20 07:05:35.7224381 +0000 UTC m=+1150.595064851" observedRunningTime="2026-02-20 07:05:36.552414776 +0000 UTC m=+1151.425041517" watchObservedRunningTime="2026-02-20 07:05:36.56089751 +0000 UTC m=+1151.433524261"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.480274 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.498503 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.533947 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.535344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.594969 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.622975 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.967572 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"
Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.994774 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff"
Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.263036 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz"
Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.293617 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr"
Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.349859 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"
Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.419832 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"
Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.431648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"
Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.550460 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74"
Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.603607 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"
Feb 20 07:05:42 crc kubenswrapper[5094]: I0220 07:05:42.460833 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:43 crc kubenswrapper[5094]: I0220 07:05:43.160628 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"
Feb 20 07:05:43 crc kubenswrapper[5094]: I0220 07:05:43.474639 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.087791 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"]
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.089628 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095243 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095311 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095552 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095262 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pv4gg"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.103952 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"]
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.191026 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.191551 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.208674 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"]
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.210153 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.212190 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.223290 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"]
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293717 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.294575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.312755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.395719 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.395865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.395907 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.397112 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.398999 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.407801 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.433991 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.535195 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn"
Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.896448 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"]
Feb 20 07:05:59 crc kubenswrapper[5094]: W0220 07:05:59.907869 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9654de45_6750_4105_a4db_050ae521d91c.slice/crio-4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd WatchSource:0}: Error finding container 4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd: Status 404 returned error can't find the container with id 4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd
Feb 20 07:06:00 crc kubenswrapper[5094]: I0220 07:06:00.050052 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"]
Feb 20 07:06:00 crc kubenswrapper[5094]: I0220 07:06:00.732932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" event={"ID":"ac34a5f7-69bb-416d-994d-056f5f1513e8","Type":"ContainerStarted","Data":"349408517b5e79dc8415f7cc916f79548181c386e9ff1081d09f8f6e492cac5c"}
Feb 20 07:06:00 crc kubenswrapper[5094]: I0220 07:06:00.735052 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" event={"ID":"9654de45-6750-4105-a4db-050ae521d91c","Type":"ContainerStarted","Data":"4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd"}
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.159810 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"]
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.179757 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"]
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.181078 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.238035 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"]
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.347117 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.347197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.348561 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.450668 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.450785 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.450838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.453597 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.461549 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.477003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.523545 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.777133 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"]
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.808526 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"]
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.810179 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.822886 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"]
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.932047 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"]
Feb 20 07:06:01 crc kubenswrapper[5094]: W0220 07:06:01.947165 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91df42cd_5b04_4d4c_862b_c9ccbb1b488d.slice/crio-b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed WatchSource:0}: Error finding container b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed: Status 404 returned error can't find the container with id b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.960601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.961113 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.961217 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.062685 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.062821 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.062848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.063803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.064387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.084788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.137294 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.323829 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.325689 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.329218 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.329394 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.330007 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.330189 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.330319 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.331941 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.332412 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.332689 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2rc9x"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468500 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468527 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468558 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468607 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468632 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468658 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468730 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468758 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.570822 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.570931 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.570991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571023 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571183 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571222 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571267 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571881 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571940 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572626 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572789 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.579108 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.580080 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.583821 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.588725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.589931 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod
\"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.607017 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.644344 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.673563 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:02 crc kubenswrapper[5094]: W0220 07:06:02.717859 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16affe7_2f3d_438d_98b8_deedcf70053c.slice/crio-5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2 WatchSource:0}: Error finding container 5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2: Status 404 returned error can't find the container with id 5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2 Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.771586 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" event={"ID":"91df42cd-5b04-4d4c-862b-c9ccbb1b488d","Type":"ContainerStarted","Data":"b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed"} Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.776831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" event={"ID":"e16affe7-2f3d-438d-98b8-deedcf70053c","Type":"ContainerStarted","Data":"5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2"} Feb 20 07:06:02 crc 
kubenswrapper[5094]: I0220 07:06:02.904948 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.906794 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911349 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911635 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911686 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911729 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9fv2z" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911727 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.916808 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.917034 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.931184 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086189 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " 
pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086328 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086399 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc 
kubenswrapper[5094]: I0220 07:06:03.086443 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086518 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086536 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.187909 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188144 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188608 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188727 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188824 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" 
Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189122 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189431 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.190406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.190973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.196003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.196300 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.197032 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.205670 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.213515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 
20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.237285 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.241874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: W0220 07:06:03.254053 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda829c6b3_7069_4544_90dc_40ae83aba524.slice/crio-85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7 WatchSource:0}: Error finding container 85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7: Status 404 returned error can't find the container with id 85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7 Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.545395 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.792289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerStarted","Data":"85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7"} Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.100438 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:06:04 crc kubenswrapper[5094]: W0220 07:06:04.127223 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod219c74d6_9f45_4bf8_8c67_acdea3c0fab3.slice/crio-a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17 WatchSource:0}: Error finding container a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17: Status 404 returned error can't find the container with id a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17 Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.279909 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.281396 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.285002 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.287458 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cp9vz" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.287472 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.287931 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.292514 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.302283 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413437 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413511 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413726 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.414057 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.414149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525620 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525681 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525819 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525839 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525898 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.526184 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.527092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc 
kubenswrapper[5094]: I0220 07:06:04.533797 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.534489 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.534939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.537963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.539604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.545173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.573254 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.617562 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.830280 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerStarted","Data":"a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17"} Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.615319 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.616936 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.621808 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.622079 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bng96" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.622208 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.625117 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.631233 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752112 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752161 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752262 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752370 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752424 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859234 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859342 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859393 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859493 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859598 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.863178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.863914 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.870065 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.871376 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.872428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.884651 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.891125 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " 
pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.895812 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.899792 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.901631 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.901744 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.907311 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vr5c7" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.907633 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.907685 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.940831 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.961817 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.064908 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.064958 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.065019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.065085 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.065118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 
07:06:06.167727 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167779 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167914 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.170066 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod 
\"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.171043 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.171600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.176814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.191966 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.292093 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.152866 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.154234 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.159524 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-qqz9g" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.167989 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.305343 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"kube-state-metrics-0\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.406358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"kube-state-metrics-0\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.434180 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"kube-state-metrics-0\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.475650 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.490287 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.493175 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.496851 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.497132 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.502058 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vpr9w" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.536249 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.554803 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.557883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.567724 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.593776 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594141 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594267 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594379 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5tkf\" (UniqueName: 
\"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594983 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699265 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699369 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699413 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699444 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699469 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5tkf\" (UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699495 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699648 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"ovn-controller-ovs-tj42x\" (UID: 
\"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699857 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699906 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699931 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " 
pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.701636 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.701820 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.708326 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.708359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.735992 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.739317 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5tkf\" (UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.753353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801367 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801513 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801730 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801876 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.803822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.818105 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.844577 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.924158 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.056635 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.058937 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.063911 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.064222 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-26v89"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.064582 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.064951 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.072960 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.074134 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209009 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209120 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209139 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209189 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209209 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209232 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313027 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313893 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313928 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313967 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314023 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314414 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314944 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.318719 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.318956 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.322520 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.326543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.339889 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.348798 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.351882 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.391906 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 20 07:06:14 crc kubenswrapper[5094]: I0220 07:06:14.883866 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.705875 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.709043 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.713640 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.714010 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.714051 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9xcbp"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.716800 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.719530 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.812537 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.812662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.813963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815458 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815849 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.917830 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918045 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918214 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918335 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918367 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918786 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.919217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.919899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.921534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.926197 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.926659 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.926761 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.936038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.960300 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:16 crc kubenswrapper[5094]: I0220 07:06:16.030450 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 20 07:06:20 crc kubenswrapper[5094]: I0220 07:06:20.036179 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerStarted","Data":"ab6b9e58f533ca8387a46ffe5e0cb304794c4450b59c803c80417c57e86e76ef"}
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.615859 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.616091 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x746k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-kfk8f_openstack(e16affe7-2f3d-438d-98b8-deedcf70053c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.617282 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" podUID="e16affe7-2f3d-438d-98b8-deedcf70053c"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.631881 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.632071 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zn97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d6b9fdb89-jpfm5_openstack(91df42cd-5b04-4d4c-862b-c9ccbb1b488d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.633176 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.637394 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.637612 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gc7th,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-z44mn_openstack(ac34a5f7-69bb-416d-994d-056f5f1513e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.640025 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" podUID="ac34a5f7-69bb-416d-994d-056f5f1513e8"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.762417 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2"
Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.762948 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmdmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-tz76m_openstack(9654de45-6750-4105-a4db-050ae521d91c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.764262 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" podUID="9654de45-6750-4105-a4db-050ae521d91c" Feb 20 07:06:21 crc kubenswrapper[5094]: E0220 07:06:21.047298 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" podUID="e16affe7-2f3d-438d-98b8-deedcf70053c" Feb 20 07:06:21 crc kubenswrapper[5094]: E0220 07:06:21.047352 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.169126 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.204330 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.207343 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod715094df_6704_4332_b990_95d790fd5ff1.slice/crio-3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3 WatchSource:0}: Error finding container 3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3: Status 404 returned error can't find the container with id 3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3 Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.288224 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 
07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.306687 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07969dc9_1a07_455c_b6c4_6b5f3bb23cb9.slice/crio-6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5 WatchSource:0}: Error finding container 6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5: Status 404 returned error can't find the container with id 6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5 Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.509493 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.514955 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcadd011d_8dde_4346_8608_c5f74376204d.slice/crio-066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a WatchSource:0}: Error finding container 066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a: Status 404 returned error can't find the container with id 066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.557311 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.569178 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.582882 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.586668 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.600032 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecb5d91_5ba1_457e_af42_0d78c8643250.slice/crio-af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf WatchSource:0}: Error finding container af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf: Status 404 returned error can't find the container with id af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.671570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") pod \"ac34a5f7-69bb-416d-994d-056f5f1513e8\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.671882 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"ac34a5f7-69bb-416d-994d-056f5f1513e8\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672017 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"ac34a5f7-69bb-416d-994d-056f5f1513e8\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"9654de45-6750-4105-a4db-050ae521d91c\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672110 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"9654de45-6750-4105-a4db-050ae521d91c\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672587 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config" (OuterVolumeSpecName: "config") pod "ac34a5f7-69bb-416d-994d-056f5f1513e8" (UID: "ac34a5f7-69bb-416d-994d-056f5f1513e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672449 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac34a5f7-69bb-416d-994d-056f5f1513e8" (UID: "ac34a5f7-69bb-416d-994d-056f5f1513e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config" (OuterVolumeSpecName: "config") pod "9654de45-6750-4105-a4db-050ae521d91c" (UID: "9654de45-6750-4105-a4db-050ae521d91c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.679519 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc" (OuterVolumeSpecName: "kube-api-access-vmdmc") pod "9654de45-6750-4105-a4db-050ae521d91c" (UID: "9654de45-6750-4105-a4db-050ae521d91c"). InnerVolumeSpecName "kube-api-access-vmdmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.679589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th" (OuterVolumeSpecName: "kube-api-access-gc7th") pod "ac34a5f7-69bb-416d-994d-056f5f1513e8" (UID: "ac34a5f7-69bb-416d-994d-056f5f1513e8"). InnerVolumeSpecName "kube-api-access-gc7th". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774234 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774285 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774298 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774313 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc7th\" (UniqueName: 
\"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774325 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.054165 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerStarted","Data":"c905b3c584b4bdb1a44662bb87e5389e8137126047bfc23039edbbaea024118a"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.056748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2" event={"ID":"8ecb5d91-5ba1-457e-af42-0d78c8643250","Type":"ContainerStarted","Data":"af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.058563 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerStarted","Data":"066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.060472 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.062067 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.062071 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" event={"ID":"ac34a5f7-69bb-416d-994d-056f5f1513e8","Type":"ContainerDied","Data":"349408517b5e79dc8415f7cc916f79548181c386e9ff1081d09f8f6e492cac5c"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.066939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" event={"ID":"9654de45-6750-4105-a4db-050ae521d91c","Type":"ContainerDied","Data":"4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.066979 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.069306 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerStarted","Data":"3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.072098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerStarted","Data":"69690ca4b7abd1bc1955c808ad93fa95a3a579fa31419e9ead102d78d2680915"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.114858 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.131163 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.148083 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 
07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.155502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.577047 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.102533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerStarted","Data":"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c"} Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.105415 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerStarted","Data":"0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145"} Feb 20 07:06:23 crc kubenswrapper[5094]: W0220 07:06:23.524589 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ec8857_5a33_44ea_bdd0_97b343adfc8a.slice/crio-c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9 WatchSource:0}: Error finding container c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9: Status 404 returned error can't find the container with id c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9 Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.850678 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9654de45-6750-4105-a4db-050ae521d91c" path="/var/lib/kubelet/pods/9654de45-6750-4105-a4db-050ae521d91c/volumes" Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.851210 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac34a5f7-69bb-416d-994d-056f5f1513e8" path="/var/lib/kubelet/pods/ac34a5f7-69bb-416d-994d-056f5f1513e8/volumes" Feb 20 07:06:24 crc 
kubenswrapper[5094]: I0220 07:06:24.115971 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerStarted","Data":"c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.195450 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerStarted","Data":"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.196170 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.198799 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerStarted","Data":"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.201505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerStarted","Data":"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.202056 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.203993 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerStarted","Data":"87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.205848 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerStarted","Data":"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.211255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2" event={"ID":"8ecb5d91-5ba1-457e-af42-0d78c8643250","Type":"ContainerStarted","Data":"4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.219635 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.653726319 podStartE2EDuration="22.219611684s" podCreationTimestamp="2026-02-20 07:06:08 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.210328619 +0000 UTC m=+1196.082955330" lastFinishedPulling="2026-02-20 07:06:29.776213984 +0000 UTC m=+1204.648840695" observedRunningTime="2026-02-20 07:06:30.215932516 +0000 UTC m=+1205.088559227" watchObservedRunningTime="2026-02-20 07:06:30.219611684 +0000 UTC m=+1205.092238395" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.240041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerStarted","Data":"333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.240738 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lvlr2" podStartSLOduration=11.887803551 podStartE2EDuration="19.240683698s" podCreationTimestamp="2026-02-20 07:06:11 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.606231349 +0000 UTC m=+1196.478858060" lastFinishedPulling="2026-02-20 07:06:28.959111496 +0000 UTC m=+1203.831738207" observedRunningTime="2026-02-20 07:06:30.23616366 +0000 UTC m=+1205.108790371" watchObservedRunningTime="2026-02-20 07:06:30.240683698 +0000 UTC 
m=+1205.113310409" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.244447 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.320766 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.106807956 podStartE2EDuration="25.320746315s" podCreationTimestamp="2026-02-20 07:06:05 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.182155543 +0000 UTC m=+1196.054782254" lastFinishedPulling="2026-02-20 07:06:28.396093892 +0000 UTC m=+1203.268720613" observedRunningTime="2026-02-20 07:06:30.312891648 +0000 UTC m=+1205.185518359" watchObservedRunningTime="2026-02-20 07:06:30.320746315 +0000 UTC m=+1205.193373026" Feb 20 07:06:31 crc kubenswrapper[5094]: I0220 07:06:31.256180 5094 generic.go:334] "Generic (PLEG): container finished" podID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" exitCode=0 Feb 20 07:06:31 crc kubenswrapper[5094]: I0220 07:06:31.257807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf"} Feb 20 07:06:31 crc kubenswrapper[5094]: I0220 07:06:31.258318 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:32 crc kubenswrapper[5094]: I0220 07:06:32.272034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d"} Feb 20 
07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.285544 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerStarted","Data":"2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47"} Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.293241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a"} Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.293547 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.296173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerStarted","Data":"d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976"} Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.322507 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.074221776 podStartE2EDuration="22.322484745s" podCreationTimestamp="2026-02-20 07:06:11 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.518086088 +0000 UTC m=+1196.390712799" lastFinishedPulling="2026-02-20 07:06:31.766349057 +0000 UTC m=+1206.638975768" observedRunningTime="2026-02-20 07:06:33.319900723 +0000 UTC m=+1208.192527514" watchObservedRunningTime="2026-02-20 07:06:33.322484745 +0000 UTC m=+1208.195111466" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.342935 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.093758263 podStartE2EDuration="19.342914775s" podCreationTimestamp="2026-02-20 07:06:14 +0000 UTC" 
firstStartedPulling="2026-02-20 07:06:23.526733714 +0000 UTC m=+1198.399360425" lastFinishedPulling="2026-02-20 07:06:31.775890226 +0000 UTC m=+1206.648516937" observedRunningTime="2026-02-20 07:06:33.342432283 +0000 UTC m=+1208.215058984" watchObservedRunningTime="2026-02-20 07:06:33.342914775 +0000 UTC m=+1208.215541486" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.374846 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tj42x" podStartSLOduration=14.752109311 podStartE2EDuration="22.374815059s" podCreationTimestamp="2026-02-20 07:06:11 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.309635647 +0000 UTC m=+1196.182262358" lastFinishedPulling="2026-02-20 07:06:28.932341385 +0000 UTC m=+1203.804968106" observedRunningTime="2026-02-20 07:06:33.36403313 +0000 UTC m=+1208.236659841" watchObservedRunningTime="2026-02-20 07:06:33.374815059 +0000 UTC m=+1208.247441780" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.393679 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.439330 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.031210 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.104806 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.309953 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" exitCode=0 Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.310113 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerDied","Data":"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25"} Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.313600 5094 generic.go:334] "Generic (PLEG): container finished" podID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8" exitCode=0 Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.313660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerDied","Data":"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"} Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.315505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.315557 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.315581 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.384645 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.705343 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.732814 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.734769 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.737333 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.786680 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.818936 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.823035 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.829571 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.864817 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.869961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870090 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870139 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870162 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870259 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870333 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc 
kubenswrapper[5094]: I0220 07:06:34.870385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972566 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972683 
5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972874 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972906 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972963 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.973006 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.973048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.974974 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.976092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.976652 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.977296 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.977407 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.977593 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.984351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.990180 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" 
(UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.995475 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.002498 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.108357 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.136015 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.137528 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.137630 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.144074 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.161661 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.177993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178114 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178339 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" 
Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.184248 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.238024 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284195 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"e16affe7-2f3d-438d-98b8-deedcf70053c\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284306 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"e16affe7-2f3d-438d-98b8-deedcf70053c\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284424 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"e16affe7-2f3d-438d-98b8-deedcf70053c\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284790 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284897 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284950 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284977 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.286092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.286913 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config" (OuterVolumeSpecName: "config") pod "e16affe7-2f3d-438d-98b8-deedcf70053c" (UID: "e16affe7-2f3d-438d-98b8-deedcf70053c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.287302 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e16affe7-2f3d-438d-98b8-deedcf70053c" (UID: "e16affe7-2f3d-438d-98b8-deedcf70053c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.290390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.290536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.291445 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.292436 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k" (OuterVolumeSpecName: "kube-api-access-x746k") pod "e16affe7-2f3d-438d-98b8-deedcf70053c" (UID: "e16affe7-2f3d-438d-98b8-deedcf70053c"). InnerVolumeSpecName "kube-api-access-x746k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.388108 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.388138 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.388151 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.389578 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.409186 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerID="1b34af8db883238b2e612d791bb36ef95778df03bbf733644cb48e464b571a49" exitCode=0 Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.409289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" event={"ID":"91df42cd-5b04-4d4c-862b-c9ccbb1b488d","Type":"ContainerDied","Data":"1b34af8db883238b2e612d791bb36ef95778df03bbf733644cb48e464b571a49"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.466338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerStarted","Data":"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.477389 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.495580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" event={"ID":"e16affe7-2f3d-438d-98b8-deedcf70053c","Type":"ContainerDied","Data":"5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.495692 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.535600 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.973406419 podStartE2EDuration="32.535578277s" podCreationTimestamp="2026-02-20 07:06:03 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.595561745 +0000 UTC m=+1196.468188446" lastFinishedPulling="2026-02-20 07:06:29.157733563 +0000 UTC m=+1204.030360304" observedRunningTime="2026-02-20 07:06:35.514271417 +0000 UTC m=+1210.386898128" watchObservedRunningTime="2026-02-20 07:06:35.535578277 +0000 UTC m=+1210.408204988" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.544142 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerStarted","Data":"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.604253 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.278498907 podStartE2EDuration="31.604228191s" podCreationTimestamp="2026-02-20 07:06:04 +0000 UTC" firstStartedPulling="2026-02-20 07:06:19.606502078 +0000 UTC m=+1194.479128819" lastFinishedPulling="2026-02-20 07:06:28.932231402 +0000 UTC m=+1203.804858103" observedRunningTime="2026-02-20 07:06:35.59082104 +0000 UTC m=+1210.463447751" watchObservedRunningTime="2026-02-20 07:06:35.604228191 +0000 UTC m=+1210.476854902" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.666939 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.695789 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:35 crc kubenswrapper[5094]: 
I0220 07:06:35.704156 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.710429 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:35 crc kubenswrapper[5094]: W0220 07:06:35.737757 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod304622e2_8b43_4cc6_b17b_79a03bc74a58.slice/crio-31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b WatchSource:0}: Error finding container 31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b: Status 404 returned error can't find the container with id 31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b Feb 20 07:06:35 crc kubenswrapper[5094]: E0220 07:06:35.810226 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16affe7_2f3d_438d_98b8_deedcf70053c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16affe7_2f3d_438d_98b8_deedcf70053c.slice/crio-5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2\": RecentStats: unable to find data in memory cache]" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.867864 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16affe7-2f3d-438d-98b8-deedcf70053c" path="/var/lib/kubelet/pods/e16affe7-2f3d-438d-98b8-deedcf70053c/volumes" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.959941 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.963864 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:35 crc 
kubenswrapper[5094]: I0220 07:06:35.963902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.964031 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.970069 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.970726 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4w8fv" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.971000 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.972287 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:35.999557 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.008789 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.029870 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.072606 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.140576 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.140954 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.141253 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142031 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 
crc kubenswrapper[5094]: I0220 07:06:36.142194 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142334 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142631 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142766 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.145842 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97" (OuterVolumeSpecName: "kube-api-access-2zn97") pod "91df42cd-5b04-4d4c-862b-c9ccbb1b488d" (UID: "91df42cd-5b04-4d4c-862b-c9ccbb1b488d"). InnerVolumeSpecName "kube-api-access-2zn97". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.160818 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config" (OuterVolumeSpecName: "config") pod "91df42cd-5b04-4d4c-862b-c9ccbb1b488d" (UID: "91df42cd-5b04-4d4c-862b-c9ccbb1b488d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.164523 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91df42cd-5b04-4d4c-862b-c9ccbb1b488d" (UID: "91df42cd-5b04-4d4c-862b-c9ccbb1b488d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245219 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245542 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245783 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.246145 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.246318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.246480 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247523 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247779 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247916 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.248590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") 
" pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.248758 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.254890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.255013 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.255372 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.267130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.293852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 20 07:06:36 crc 
kubenswrapper[5094]: I0220 07:06:36.307638 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.554954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerStarted","Data":"ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310"} Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.555339 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerStarted","Data":"8220b8fae0a21f561557c1539a6ab409db96f4d4c24a493c8737608b37dc4bc1"} Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.557525 5094 generic.go:334] "Generic (PLEG): container finished" podID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" exitCode=0 Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.557618 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerDied","Data":"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82"} Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.557683 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerStarted","Data":"ee40b807024485305f67748b89b6f21b26eeabd4f3866126d5f1e66f00f01af7"} Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.559466 5094 generic.go:334] "Generic (PLEG): container finished" podID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerID="b7767b5d4adb615e7aa00736fc2736b07d8efff1a7193f90cf7bc360858c7390" exitCode=0 Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.559676 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerDied","Data":"b7767b5d4adb615e7aa00736fc2736b07d8efff1a7193f90cf7bc360858c7390"} Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.559722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerStarted","Data":"31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b"} Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.563900 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" event={"ID":"91df42cd-5b04-4d4c-862b-c9ccbb1b488d","Type":"ContainerDied","Data":"b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed"} Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.564006 5094 scope.go:117] "RemoveContainer" containerID="1b34af8db883238b2e612d791bb36ef95778df03bbf733644cb48e464b571a49" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.564015 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.581502 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-gg2h9" podStartSLOduration=2.581467015 podStartE2EDuration="2.581467015s" podCreationTimestamp="2026-02-20 07:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:36.578773951 +0000 UTC m=+1211.451400662" watchObservedRunningTime="2026-02-20 07:06:36.581467015 +0000 UTC m=+1211.454093726" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.728738 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"] Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.738225 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"] Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.829018 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:06:36 crc kubenswrapper[5094]: W0220 07:06:36.837852 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd92d75e_9882_4bb7_a41e_cab9777424e8.slice/crio-152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900 WatchSource:0}: Error finding container 152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900: Status 404 returned error can't find the container with id 152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900 Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.579770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerStarted","Data":"152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900"} Feb 20 07:06:37 crc kubenswrapper[5094]: 
I0220 07:06:37.585634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerStarted","Data":"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf"} Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.585998 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.588580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerStarted","Data":"c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e"} Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.628817 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" podStartSLOduration=2.628792958 podStartE2EDuration="2.628792958s" podCreationTimestamp="2026-02-20 07:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:37.624539066 +0000 UTC m=+1212.497165787" watchObservedRunningTime="2026-02-20 07:06:37.628792958 +0000 UTC m=+1212.501419679" Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.656500 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" podStartSLOduration=3.656478061 podStartE2EDuration="3.656478061s" podCreationTimestamp="2026-02-20 07:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:37.651497382 +0000 UTC m=+1212.524124103" watchObservedRunningTime="2026-02-20 07:06:37.656478061 +0000 UTC m=+1212.529104762" Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.855430 5094 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" path="/var/lib/kubelet/pods/91df42cd-5b04-4d4c-862b-c9ccbb1b488d/volumes" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.494405 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.497343 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.575597 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:06:38 crc kubenswrapper[5094]: E0220 07:06:38.576070 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerName="init" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.576090 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerName="init" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.576266 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerName="init" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.577263 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592346 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592419 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592478 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" 
(UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.603920 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607696 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerStarted","Data":"aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb"} Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607746 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607759 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerStarted","Data":"3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680"} Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607770 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694437 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: 
I0220 07:06:38.694554 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694577 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694599 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.695626 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.695881 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.696397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.696557 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.719553 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.905765 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.558181 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.57590325 podStartE2EDuration="4.558158075s" podCreationTimestamp="2026-02-20 07:06:35 +0000 UTC" firstStartedPulling="2026-02-20 07:06:36.862208368 +0000 UTC m=+1211.734835079" lastFinishedPulling="2026-02-20 07:06:37.844463183 +0000 UTC m=+1212.717089904" observedRunningTime="2026-02-20 07:06:38.640004676 +0000 UTC m=+1213.512631387" watchObservedRunningTime="2026-02-20 07:06:39.558158075 +0000 UTC m=+1214.430784786" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.559416 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.620430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerStarted","Data":"4f29f1584725a78d33c79c69353a3c195206f10f8b9ed911fdd59866eb9d81be"} Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.620571 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns" containerID="cri-o://c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e" gracePeriod=10 Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.689584 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.694985 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.698651 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5btkp" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.698784 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.698714 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.700260 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720587 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720694 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.721077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.721294 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.752673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823846 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823939 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823982 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.824044 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.824494 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: E0220 07:06:39.825044 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 07:06:39 crc kubenswrapper[5094]: E0220 07:06:39.825090 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 07:06:39 crc kubenswrapper[5094]: E0220 07:06:39.825166 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:40.325134429 +0000 UTC m=+1215.197761150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.825646 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.825878 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.835418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.847161 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.855424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.262677 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vpv24"]
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.263879 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.281489 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"]
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.301790 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.302623 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.302847 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.334373 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:40 crc kubenswrapper[5094]: E0220 07:06:40.334578 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 07:06:40 crc kubenswrapper[5094]: E0220 07:06:40.334594 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 07:06:40 crc kubenswrapper[5094]: E0220 07:06:40.334638 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:41.33462215 +0000 UTC m=+1216.207248861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436460 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436612 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436659 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436790 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.437093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.437176 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.437285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540542 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540680 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540891 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540965 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541008 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541066 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541325 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.542458 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.547319 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.547773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.551310 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.558971 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.617370 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.628372 5094 generic.go:334] "Generic (PLEG): container finished" podID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerID="c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e" exitCode=0
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.628426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerDied","Data":"c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e"}
Feb 20 07:06:41 crc kubenswrapper[5094]: I0220 07:06:41.100221 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"]
Feb 20 07:06:41 crc kubenswrapper[5094]: W0220 07:06:41.105965 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffd170be_0f58_4016_a451_5fb1f7fd9f1b.slice/crio-1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b WatchSource:0}: Error finding container 1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b: Status 404 returned error can't find the container with id 1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b
Feb 20 07:06:41 crc kubenswrapper[5094]: I0220 07:06:41.361297 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:41 crc kubenswrapper[5094]: E0220 07:06:41.362221 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 07:06:41 crc kubenswrapper[5094]: E0220 07:06:41.362240 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 07:06:41 crc kubenswrapper[5094]: E0220 07:06:41.362531 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:43.362405545 +0000 UTC m=+1218.235032256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found
Feb 20 07:06:41 crc kubenswrapper[5094]: I0220 07:06:41.639796 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerStarted","Data":"1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b"}
Feb 20 07:06:43 crc kubenswrapper[5094]: I0220 07:06:43.443961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:43 crc kubenswrapper[5094]: E0220 07:06:43.444191 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 07:06:43 crc kubenswrapper[5094]: E0220 07:06:43.444642 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 07:06:43 crc kubenswrapper[5094]: E0220 07:06:43.444724 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:47.444689825 +0000 UTC m=+1222.317316536 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.306676 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.365777 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") "
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.365896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") "
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.365976 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") "
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.366148 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") "
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.388201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x" (OuterVolumeSpecName: "kube-api-access-4c67x") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "kube-api-access-4c67x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.421800 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.439369 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.454779 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config" (OuterVolumeSpecName: "config") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.478976 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.479016 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.479026 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.479036 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.618151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.618211 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.638168 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.673809 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerDied","Data":"31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b"}
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.673882 5094 scope.go:117] "RemoveContainer" containerID="c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.674048 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.693476 5094 generic.go:334] "Generic (PLEG): container finished" podID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerID="d125587006c31c65f1eeb83ce252e5afe7d019516fe47b88f48976b518f4ec0b" exitCode=0
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.693524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerDied","Data":"d125587006c31c65f1eeb83ce252e5afe7d019516fe47b88f48976b518f4ec0b"}
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.724078 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"]
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.725587 5094 scope.go:117] "RemoveContainer" containerID="b7767b5d4adb615e7aa00736fc2736b07d8efff1a7193f90cf7bc360858c7390"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.730258 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.735464 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"]
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.740653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.868782 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.479525 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds"
Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.707499 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerStarted","Data":"a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d"}
Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.708906 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.730228 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" podStartSLOduration=7.730212941 podStartE2EDuration="7.730212941s" podCreationTimestamp="2026-02-20 07:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:45.726258126 +0000 UTC m=+1220.598884857" watchObservedRunningTime="2026-02-20 07:06:45.730212941 +0000 UTC m=+1220.602839652"
Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.858165 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" path="/var/lib/kubelet/pods/304622e2-8b43-4cc6-b17b-79a03bc74a58/volumes"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.568262 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"]
Feb 20 07:06:46 crc kubenswrapper[5094]: E0220 07:06:46.569384 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="init"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.569405 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="init"
Feb 20 07:06:46 crc kubenswrapper[5094]: E0220 07:06:46.569470 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.569481 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.569699 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.570380 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.575264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.585781 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"]
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.633181 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.633649 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.669327 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-z4m42"]
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.672372 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.693878 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z4m42"]
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.734881 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.734945 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.734978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.735031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.736515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.759513 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.837594 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.837829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.838867 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.864966 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.902406 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk"
Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.993974 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z4m42"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.217080 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-kbc28"]
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.218479 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kbc28"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.233778 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kbc28"]
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.248447 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.248637 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.305820 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"]
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.307015 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.312539 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.315040 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"]
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.353726 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.353866 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.353985 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.354049 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.355671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.393555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.455008 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-p4nhd"]
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.455880 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.455975 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf"
Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.456116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm2vj\" (UniqueName:
\"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: E0220 07:06:47.456140 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:06:47 crc kubenswrapper[5094]: E0220 07:06:47.456504 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 07:06:47 crc kubenswrapper[5094]: E0220 07:06:47.456560 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:55.456540985 +0000 UTC m=+1230.329167696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.456689 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.457745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.473644 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.483662 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.548235 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.559175 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.559253 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.588537 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.630572 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.630857 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.633956 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.638066 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.660846 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.661093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.661223 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.661309 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod 
\"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.662561 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.684214 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.764139 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.764296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.765789 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod 
\"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.783681 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.821635 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.963315 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.337775 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.491477 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kbc28"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.587822 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.658542 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.673800 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.687943 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:06:48 crc 
kubenswrapper[5094]: I0220 07:06:48.738057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kbc28" event={"ID":"a0e18d8b-2657-4e87-b6ca-009df89bbac8","Type":"ContainerStarted","Data":"9a416b443cda054982b69d39244868053fabefa912f2d78cab0a7899918d1ec1"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.740601 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc2-account-create-update-vqjjf" event={"ID":"876bc507-6cf2-466a-9cd3-6131a1cc590e","Type":"ContainerStarted","Data":"d597bee2674c3a780daf20b3f5ea75594b44e3bee62e1bae78efb516888a6f5b"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.742338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerStarted","Data":"a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.744470 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p4nhd" event={"ID":"462ace9b-51c7-4cd0-850a-65d714c5f3b6","Type":"ContainerStarted","Data":"466bc14b6e47a7f8a431b6bfab8a713c013e95d60e6dbe91267a47d164014dad"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.746750 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-8g9zk" event={"ID":"772e2155-8d29-40de-8aff-5e42112e6171","Type":"ContainerStarted","Data":"0331760d548ee8233a915834188613dcd12684d1c1089583d261fa2fb26afee8"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.749269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e9d-account-create-update-f9bgk" event={"ID":"20ff73f2-0b55-4d81-9342-92dbe47435f0","Type":"ContainerStarted","Data":"2f8481373beadef4696472d4f06094adaa3a02416567c1059a09c95ff4c7fc9d"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.751730 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-db-create-z4m42" event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerStarted","Data":"75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.751785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z4m42" event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerStarted","Data":"7d549be08c997e24ddc46194864b94c1ed01c341dc44d1d9faa29c3e8fd1f0b4"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.766924 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vpv24" podStartSLOduration=1.9196456 podStartE2EDuration="8.766901647s" podCreationTimestamp="2026-02-20 07:06:40 +0000 UTC" firstStartedPulling="2026-02-20 07:06:41.108659188 +0000 UTC m=+1215.981285889" lastFinishedPulling="2026-02-20 07:06:47.955915215 +0000 UTC m=+1222.828541936" observedRunningTime="2026-02-20 07:06:48.76201443 +0000 UTC m=+1223.634641141" watchObservedRunningTime="2026-02-20 07:06:48.766901647 +0000 UTC m=+1223.639528348" Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.783540 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-z4m42" podStartSLOduration=2.783524175 podStartE2EDuration="2.783524175s" podCreationTimestamp="2026-02-20 07:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:48.77785242 +0000 UTC m=+1223.650479151" watchObservedRunningTime="2026-02-20 07:06:48.783524175 +0000 UTC m=+1223.656150886" Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.775288 5094 generic.go:334] "Generic (PLEG): container finished" podID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerID="2a24e00dad1ec7884b816153f26b2812e819c54c4c8093cf48001992ec89df96" exitCode=0 Feb 20 07:06:49 crc 
kubenswrapper[5094]: I0220 07:06:49.775426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p4nhd" event={"ID":"462ace9b-51c7-4cd0-850a-65d714c5f3b6","Type":"ContainerDied","Data":"2a24e00dad1ec7884b816153f26b2812e819c54c4c8093cf48001992ec89df96"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.778861 5094 generic.go:334] "Generic (PLEG): container finished" podID="772e2155-8d29-40de-8aff-5e42112e6171" containerID="b37e6501ae2ac6b9c9a4901b1e8b894900b9c70d4214f260a2bb15e75fba5205" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.778939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-8g9zk" event={"ID":"772e2155-8d29-40de-8aff-5e42112e6171","Type":"ContainerDied","Data":"b37e6501ae2ac6b9c9a4901b1e8b894900b9c70d4214f260a2bb15e75fba5205"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.782586 5094 generic.go:334] "Generic (PLEG): container finished" podID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerID="c8a3057121d16618bfdbd39860a04679adcd72a905e063ca9af153f1f199e6f2" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.782922 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e9d-account-create-update-f9bgk" event={"ID":"20ff73f2-0b55-4d81-9342-92dbe47435f0","Type":"ContainerDied","Data":"c8a3057121d16618bfdbd39860a04679adcd72a905e063ca9af153f1f199e6f2"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.785845 5094 generic.go:334] "Generic (PLEG): container finished" podID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerID="75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.786111 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z4m42" 
event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerDied","Data":"75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.788341 5094 generic.go:334] "Generic (PLEG): container finished" podID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerID="5fbdea48cb9017b90d8f206860f008a7a92776227fe74ba390e642bcf9bceabc" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.788568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc2-account-create-update-vqjjf" event={"ID":"876bc507-6cf2-466a-9cd3-6131a1cc590e","Type":"ContainerDied","Data":"5fbdea48cb9017b90d8f206860f008a7a92776227fe74ba390e642bcf9bceabc"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.791621 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerID="b09eaa7d98442981b4e7ce37eedd93a2e3a6cd66a6970eb460a847a861e69caa" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.791834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kbc28" event={"ID":"a0e18d8b-2657-4e87-b6ca-009df89bbac8","Type":"ContainerDied","Data":"b09eaa7d98442981b4e7ce37eedd93a2e3a6cd66a6970eb460a847a861e69caa"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.359535 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-z4m42" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.459047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"a87399a2-42e4-4f46-b93c-cd4f25594a16\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.459319 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") pod \"a87399a2-42e4-4f46-b93c-cd4f25594a16\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.460112 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a87399a2-42e4-4f46-b93c-cd4f25594a16" (UID: "a87399a2-42e4-4f46-b93c-cd4f25594a16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.469606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx" (OuterVolumeSpecName: "kube-api-access-km9jx") pod "a87399a2-42e4-4f46-b93c-cd4f25594a16" (UID: "a87399a2-42e4-4f46-b93c-cd4f25594a16"). InnerVolumeSpecName "kube-api-access-km9jx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.562199 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.562239 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.565056 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.570411 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.574938 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.581878 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.596417 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671542 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"20ff73f2-0b55-4d81-9342-92dbe47435f0\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"772e2155-8d29-40de-8aff-5e42112e6171\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671750 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671797 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671819 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod \"772e2155-8d29-40de-8aff-5e42112e6171\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671921 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"876bc507-6cf2-466a-9cd3-6131a1cc590e\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.672017 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod \"20ff73f2-0b55-4d81-9342-92dbe47435f0\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.672085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.672111 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"876bc507-6cf2-466a-9cd3-6131a1cc590e\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.673600 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"772e2155-8d29-40de-8aff-5e42112e6171" (UID: "772e2155-8d29-40de-8aff-5e42112e6171"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.673961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20ff73f2-0b55-4d81-9342-92dbe47435f0" (UID: "20ff73f2-0b55-4d81-9342-92dbe47435f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.674157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "462ace9b-51c7-4cd0-850a-65d714c5f3b6" (UID: "462ace9b-51c7-4cd0-850a-65d714c5f3b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.674491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "876bc507-6cf2-466a-9cd3-6131a1cc590e" (UID: "876bc507-6cf2-466a-9cd3-6131a1cc590e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.675877 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0e18d8b-2657-4e87-b6ca-009df89bbac8" (UID: "a0e18d8b-2657-4e87-b6ca-009df89bbac8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.677571 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk" (OuterVolumeSpecName: "kube-api-access-84jkk") pod "462ace9b-51c7-4cd0-850a-65d714c5f3b6" (UID: "462ace9b-51c7-4cd0-850a-65d714c5f3b6"). InnerVolumeSpecName "kube-api-access-84jkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.678482 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt" (OuterVolumeSpecName: "kube-api-access-rf4vt") pod "772e2155-8d29-40de-8aff-5e42112e6171" (UID: "772e2155-8d29-40de-8aff-5e42112e6171"). InnerVolumeSpecName "kube-api-access-rf4vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.680623 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj" (OuterVolumeSpecName: "kube-api-access-xm2vj") pod "876bc507-6cf2-466a-9cd3-6131a1cc590e" (UID: "876bc507-6cf2-466a-9cd3-6131a1cc590e"). InnerVolumeSpecName "kube-api-access-xm2vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.682981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77" (OuterVolumeSpecName: "kube-api-access-6qh77") pod "20ff73f2-0b55-4d81-9342-92dbe47435f0" (UID: "20ff73f2-0b55-4d81-9342-92dbe47435f0"). InnerVolumeSpecName "kube-api-access-6qh77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.685292 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp" (OuterVolumeSpecName: "kube-api-access-mqdrp") pod "a0e18d8b-2657-4e87-b6ca-009df89bbac8" (UID: "a0e18d8b-2657-4e87-b6ca-009df89bbac8"). InnerVolumeSpecName "kube-api-access-mqdrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773786 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773831 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773865 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773877 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773888 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773898 5094 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773907 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773918 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773927 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773938 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.818602 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p4nhd" event={"ID":"462ace9b-51c7-4cd0-850a-65d714c5f3b6","Type":"ContainerDied","Data":"466bc14b6e47a7f8a431b6bfab8a713c013e95d60e6dbe91267a47d164014dad"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.818671 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="466bc14b6e47a7f8a431b6bfab8a713c013e95d60e6dbe91267a47d164014dad" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.818775 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.825136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-8g9zk" event={"ID":"772e2155-8d29-40de-8aff-5e42112e6171","Type":"ContainerDied","Data":"0331760d548ee8233a915834188613dcd12684d1c1089583d261fa2fb26afee8"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.825182 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0331760d548ee8233a915834188613dcd12684d1c1089583d261fa2fb26afee8" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.825239 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.830051 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e9d-account-create-update-f9bgk" event={"ID":"20ff73f2-0b55-4d81-9342-92dbe47435f0","Type":"ContainerDied","Data":"2f8481373beadef4696472d4f06094adaa3a02416567c1059a09c95ff4c7fc9d"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.830179 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8481373beadef4696472d4f06094adaa3a02416567c1059a09c95ff4c7fc9d" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.830310 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.836185 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z4m42" event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerDied","Data":"7d549be08c997e24ddc46194864b94c1ed01c341dc44d1d9faa29c3e8fd1f0b4"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.836281 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d549be08c997e24ddc46194864b94c1ed01c341dc44d1d9faa29c3e8fd1f0b4" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.836346 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z4m42" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.842214 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.847697 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875787 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kbc28" event={"ID":"a0e18d8b-2657-4e87-b6ca-009df89bbac8","Type":"ContainerDied","Data":"9a416b443cda054982b69d39244868053fabefa912f2d78cab0a7899918d1ec1"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875838 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a416b443cda054982b69d39244868053fabefa912f2d78cab0a7899918d1ec1" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc2-account-create-update-vqjjf" event={"ID":"876bc507-6cf2-466a-9cd3-6131a1cc590e","Type":"ContainerDied","Data":"d597bee2674c3a780daf20b3f5ea75594b44e3bee62e1bae78efb516888a6f5b"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875865 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d597bee2674c3a780daf20b3f5ea75594b44e3bee62e1bae78efb516888a6f5b" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.244686 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.245939 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.245963 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246000 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 
07:06:53.246011 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246026 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246035 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246045 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772e2155-8d29-40de-8aff-5e42112e6171" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246054 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="772e2155-8d29-40de-8aff-5e42112e6171" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246072 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246080 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246094 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246102 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246347 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246430 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246451 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246462 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246476 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="772e2155-8d29-40de-8aff-5e42112e6171" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246490 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.247275 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.252899 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.258792 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.317973 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.318296 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.420942 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.421679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"root-account-create-update-b2w5s\" (UID: 
\"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.423376 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.449977 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.576551 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.908012 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.979066 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.979379 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns" containerID="cri-o://32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" gracePeriod=10 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.102815 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.458475 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648440 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648681 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.665538 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8" (OuterVolumeSpecName: "kube-api-access-928v8") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "kube-api-access-928v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.698499 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.702186 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.703300 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.709563 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config" (OuterVolumeSpecName: "config") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751746 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751784 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751797 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751815 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751827 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.888783 5094 generic.go:334] "Generic (PLEG): container finished" podID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889303 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerDied","Data":"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerDied","Data":"ee40b807024485305f67748b89b6f21b26eeabd4f3866126d5f1e66f00f01af7"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889458 5094 scope.go:117] "RemoveContainer" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.891593 5094 generic.go:334] "Generic (PLEG): container finished" podID="a829c6b3-7069-4544-90dc-40ae83aba524" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.891669 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerDied","Data":"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.893611 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerID="b30b7402d8eb93fe27dd4eeb5df1c58c1d66056e0ef8f55ed4b6d91fb78c16c7" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.893728 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b2w5s" 
event={"ID":"4a26e4e5-091b-4a9e-8a16-2dc535e85fae","Type":"ContainerDied","Data":"b30b7402d8eb93fe27dd4eeb5df1c58c1d66056e0ef8f55ed4b6d91fb78c16c7"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.893764 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b2w5s" event={"ID":"4a26e4e5-091b-4a9e-8a16-2dc535e85fae","Type":"ContainerStarted","Data":"ca6817ae07db3acb9ded9895ffb37d6966aff2b9baa689ae6bec470103bb00b6"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.897168 5094 generic.go:334] "Generic (PLEG): container finished" podID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerID="0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.897209 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerDied","Data":"0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.933052 5094 scope.go:117] "RemoveContainer" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.029893 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.033571 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.037342 5094 scope.go:117] "RemoveContainer" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" Feb 20 07:06:55 crc kubenswrapper[5094]: E0220 07:06:55.039681 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf\": container with ID starting 
with 32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf not found: ID does not exist" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.039731 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf"} err="failed to get container status \"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf\": rpc error: code = NotFound desc = could not find container \"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf\": container with ID starting with 32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf not found: ID does not exist" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.039755 5094 scope.go:117] "RemoveContainer" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" Feb 20 07:06:55 crc kubenswrapper[5094]: E0220 07:06:55.040146 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82\": container with ID starting with f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82 not found: ID does not exist" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.040166 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82"} err="failed to get container status \"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82\": rpc error: code = NotFound desc = could not find container \"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82\": container with ID starting with f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82 not found: ID does 
not exist" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.468100 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.475510 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.490437 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.858282 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" path="/var/lib/kubelet/pods/b40815a7-cffa-44ed-8acf-98261cc7e14c/volumes" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.911235 5094 generic.go:334] "Generic (PLEG): container finished" podID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerID="a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9" exitCode=0 Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.911418 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerDied","Data":"a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9"} Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.915102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerStarted","Data":"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8"}
Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.915415 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.919364 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerStarted","Data":"74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6"}
Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.922127 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.964841 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.670213249 podStartE2EDuration="54.964812562s" podCreationTimestamp="2026-02-20 07:06:01 +0000 UTC" firstStartedPulling="2026-02-20 07:06:03.264813717 +0000 UTC m=+1178.137440428" lastFinishedPulling="2026-02-20 07:06:20.55941303 +0000 UTC m=+1195.432039741" observedRunningTime="2026-02-20 07:06:55.962158188 +0000 UTC m=+1230.834784929" watchObservedRunningTime="2026-02-20 07:06:55.964812562 +0000 UTC m=+1230.837439313"
Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.986668 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.484688855 podStartE2EDuration="54.986645254s" podCreationTimestamp="2026-02-20 07:06:01 +0000 UTC" firstStartedPulling="2026-02-20 07:06:04.131381231 +0000 UTC m=+1179.004007942" lastFinishedPulling="2026-02-20 07:06:20.63333763 +0000 UTC m=+1195.505964341" observedRunningTime="2026-02-20 07:06:55.98477082 +0000 UTC m=+1230.857397531" watchObservedRunningTime="2026-02-20 07:06:55.986645254 +0000 UTC m=+1230.859271965"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.153271 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.348891 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b2w5s"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.381515 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.487304 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") "
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.487408 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") "
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.488058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a26e4e5-091b-4a9e-8a16-2dc535e85fae" (UID: "4a26e4e5-091b-4a9e-8a16-2dc535e85fae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.488962 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.496892 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld" (OuterVolumeSpecName: "kube-api-access-sdcld") pod "4a26e4e5-091b-4a9e-8a16-2dc535e85fae" (UID: "4a26e4e5-091b-4a9e-8a16-2dc535e85fae"). InnerVolumeSpecName "kube-api-access-sdcld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.591226 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.846946 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-d27ft"]
Feb 20 07:06:56 crc kubenswrapper[5094]: E0220 07:06:56.847362 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847378 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns"
Feb 20 07:06:56 crc kubenswrapper[5094]: E0220 07:06:56.847394 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="init"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847401 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="init"
Feb 20 07:06:56 crc kubenswrapper[5094]: E0220 07:06:56.847413 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerName="mariadb-account-create-update"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847419 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerName="mariadb-account-create-update"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847593 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerName="mariadb-account-create-update"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847612 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.848256 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.851847 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.852079 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9q5bq"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.859477 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-d27ft"]
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.932651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"b318a2984af52699dbcc87bf8935047ececfd11a736630826d02b012b12ef5e4"}
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.944439 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b2w5s" event={"ID":"4a26e4e5-091b-4a9e-8a16-2dc535e85fae","Type":"ContainerDied","Data":"ca6817ae07db3acb9ded9895ffb37d6966aff2b9baa689ae6bec470103bb00b6"}
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.944766 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b2w5s"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.944853 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca6817ae07db3acb9ded9895ffb37d6966aff2b9baa689ae6bec470103bb00b6"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.997777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.998133 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.998276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.998448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.103838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.104413 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.104544 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.104630 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.111726 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.126590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.128075 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.132551 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.168401 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-d27ft"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.379267 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510186 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") "
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") "
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") "
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") "
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510837 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") "
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") "
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510974 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") "
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.511262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.511858 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.521035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6" (OuterVolumeSpecName: "kube-api-access-cjgt6") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "kube-api-access-cjgt6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.523955 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.581615 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts" (OuterVolumeSpecName: "scripts") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.591899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.602063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.612968 5094 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613000 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613014 5094 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613025 5094 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613036 5094 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613045 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613057 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.814891 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-d27ft"]
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.955117 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9"}
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.955659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c"}
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.957374 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.957371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerDied","Data":"1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b"}
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.957544 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b"
Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.958440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerStarted","Data":"178c70ca4808580e5184d1f4d0b6c895bc6f3ba07b300a63f8cb105cddd95d66"}
Feb 20 07:06:58 crc kubenswrapper[5094]: I0220 07:06:58.979563 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b"}
Feb 20 07:06:58 crc kubenswrapper[5094]: I0220 07:06:58.981213 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a"}
Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.652360 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b2w5s"]
Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.657879 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b2w5s"]
Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.850657 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" path="/var/lib/kubelet/pods/4a26e4e5-091b-4a9e-8a16-2dc535e85fae/volumes"
Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.991503 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813"}
Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.991548 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53"}
Feb 20 07:07:01 crc kubenswrapper[5094]: I0220 07:07:01.006941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3"}
Feb 20 07:07:01 crc kubenswrapper[5094]: I0220 07:07:01.007430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e"}
Feb 20 07:07:01 crc kubenswrapper[5094]: I0220 07:07:01.889078 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" probeResult="failure" output=<
Feb 20 07:07:01 crc kubenswrapper[5094]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 20 07:07:01 crc kubenswrapper[5094]: >
Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053143 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c"}
Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d"}
Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2"}
Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053275 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601"}
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.078783 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991"}
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.079445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76"}
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.079460 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce"}
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.122394 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.158802266 podStartE2EDuration="25.1223741s" podCreationTimestamp="2026-02-20 07:06:38 +0000 UTC" firstStartedPulling="2026-02-20 07:06:56.157199239 +0000 UTC m=+1231.029825950" lastFinishedPulling="2026-02-20 07:07:01.120771083 +0000 UTC m=+1235.993397784" observedRunningTime="2026-02-20 07:07:03.120964516 +0000 UTC m=+1237.993591227" watchObservedRunningTime="2026-02-20 07:07:03.1223741 +0000 UTC m=+1237.995000811"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.453969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:03 crc kubenswrapper[5094]: E0220 07:07:03.454647 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerName="swift-ring-rebalance"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.454685 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerName="swift-ring-rebalance"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.455044 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerName="swift-ring-rebalance"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.456625 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.461987 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.469262 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532441 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532553 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532577 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532646 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532712 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634611 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634729 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634771 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634813 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634884 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634909 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636283 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636361 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636945 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.664996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.798358 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.109055 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.109133 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.340155 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.663852 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b77jp"]
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.664884 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b77jp"
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.667411 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.688652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b77jp"]
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.767463 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp"
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.767723 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp"
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.869753 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp"
Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.869828 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"root-account-create-update-b77jp\" (UID:
\"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.871136 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.889825 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.984388 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:06 crc kubenswrapper[5094]: I0220 07:07:06.912416 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" probeResult="failure" output=< Feb 20 07:07:06 crc kubenswrapper[5094]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 07:07:06 crc kubenswrapper[5094]: > Feb 20 07:07:06 crc kubenswrapper[5094]: I0220 07:07:06.981187 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:07:06 crc kubenswrapper[5094]: I0220 07:07:06.981732 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.250455 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.252774 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.255113 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.266159 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318473 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318580 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318643 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod 
\"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318778 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421127 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421194 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpdmb\" (UniqueName: 
\"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421236 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421259 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421307 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421521 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421673 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.423038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.424154 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.443795 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.589657 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:11 crc kubenswrapper[5094]: W0220 07:07:11.675575 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0f7da7_a9bd_4b03_b256_d05ba9323e70.slice/crio-a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa WatchSource:0}: Error finding container a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa: Status 404 returned error can't find the container with id a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa Feb 20 07:07:11 crc kubenswrapper[5094]: I0220 07:07:11.879752 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" probeResult="failure" output=< Feb 20 07:07:11 crc kubenswrapper[5094]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 07:07:11 crc kubenswrapper[5094]: > Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.169752 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerID="7c8241aa612d986c2efd3e576b0082b1361568858d2a7098f35d783948c494f3" exitCode=0 Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.169806 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerDied","Data":"7c8241aa612d986c2efd3e576b0082b1361568858d2a7098f35d783948c494f3"} Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.169837 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerStarted","Data":"a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa"} Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 
07:07:12.255593 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b77jp"] Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.263909 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:12 crc kubenswrapper[5094]: W0220 07:07:12.271120 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a516b45_9a5d_4210_82b3_b07e7251ffad.slice/crio-1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3 WatchSource:0}: Error finding container 1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3: Status 404 returned error can't find the container with id 1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3 Feb 20 07:07:12 crc kubenswrapper[5094]: W0220 07:07:12.274952 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5044f3da_a9aa_4f6e_b598_3b5e963f8731.slice/crio-1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185 WatchSource:0}: Error finding container 1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185: Status 404 returned error can't find the container with id 1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185 Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.647950 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.179827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerStarted","Data":"faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.185498 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerID="ba5029a86f52015ae26ae9c4af241df191f71a5df81010d7bab393d3d450c913" exitCode=0 Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.185639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wmdb5" event={"ID":"3a516b45-9a5d-4210-82b3-b07e7251ffad","Type":"ContainerDied","Data":"ba5029a86f52015ae26ae9c4af241df191f71a5df81010d7bab393d3d450c913"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.185673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wmdb5" event={"ID":"3a516b45-9a5d-4210-82b3-b07e7251ffad","Type":"ContainerStarted","Data":"1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.189139 5094 generic.go:334] "Generic (PLEG): container finished" podID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerID="59cc73fd6408558710efa6324658cf301b0cc15eed3c78c0c37707c5d008b54e" exitCode=0 Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.189252 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b77jp" event={"ID":"5044f3da-a9aa-4f6e-b598-3b5e963f8731","Type":"ContainerDied","Data":"59cc73fd6408558710efa6324658cf301b0cc15eed3c78c0c37707c5d008b54e"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.189301 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b77jp" event={"ID":"5044f3da-a9aa-4f6e-b598-3b5e963f8731","Type":"ContainerStarted","Data":"1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.192612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerStarted","Data":"0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6"} Feb 20 07:07:13 crc 
kubenswrapper[5094]: I0220 07:07:13.192978 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.213100 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-d27ft" podStartSLOduration=3.190560736 podStartE2EDuration="17.213059894s" podCreationTimestamp="2026-02-20 07:06:56 +0000 UTC" firstStartedPulling="2026-02-20 07:06:57.828585008 +0000 UTC m=+1232.701211719" lastFinishedPulling="2026-02-20 07:07:11.851084166 +0000 UTC m=+1246.723710877" observedRunningTime="2026-02-20 07:07:13.200621236 +0000 UTC m=+1248.073247977" watchObservedRunningTime="2026-02-20 07:07:13.213059894 +0000 UTC m=+1248.085686635" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.232294 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podStartSLOduration=10.232261624 podStartE2EDuration="10.232261624s" podCreationTimestamp="2026-02-20 07:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:13.227689304 +0000 UTC m=+1248.100316025" watchObservedRunningTime="2026-02-20 07:07:13.232261624 +0000 UTC m=+1248.104888375" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.549950 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.612611 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.694831 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712445 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712559 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.713838 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.713884 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.713987 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run" (OuterVolumeSpecName: "var-run") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.715063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts" (OuterVolumeSpecName: "scripts") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.715146 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.736593 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb" (OuterVolumeSpecName: "kube-api-access-bpdmb") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "kube-api-access-bpdmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814339 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814817 5094 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc 
kubenswrapper[5094]: I0220 07:07:14.814836 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814847 5094 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814857 5094 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814866 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814879 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.815576 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5044f3da-a9aa-4f6e-b598-3b5e963f8731" (UID: "5044f3da-a9aa-4f6e-b598-3b5e963f8731"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.821250 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw" (OuterVolumeSpecName: "kube-api-access-hstcw") pod "5044f3da-a9aa-4f6e-b598-3b5e963f8731" (UID: "5044f3da-a9aa-4f6e-b598-3b5e963f8731"). InnerVolumeSpecName "kube-api-access-hstcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.854220 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:07:14 crc kubenswrapper[5094]: E0220 07:07:14.855379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerName="mariadb-account-create-update" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.855405 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerName="mariadb-account-create-update" Feb 20 07:07:14 crc kubenswrapper[5094]: E0220 07:07:14.856080 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerName="ovn-config" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.856131 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerName="ovn-config" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.856520 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerName="ovn-config" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.856547 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerName="mariadb-account-create-update" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.857513 5094 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.870827 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916770 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916877 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916893 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.955542 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.956726 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.960014 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.971810 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018503 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018567 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: 
\"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.019439 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.046267 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.047546 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.055905 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.062939 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.105751 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-plbtm"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.106871 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109372 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109596 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109850 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109980 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.120737 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.120818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.120877 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc 
kubenswrapper[5094]: I0220 07:07:15.120951 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.121767 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.129809 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-plbtm"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.150890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.150994 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.186163 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.188100 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.189358 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.215652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.223933 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224018 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224046 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224072 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"keystone-db-sync-plbtm\" (UID: 
\"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224132 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224156 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.225167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.228188 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.230271 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b77jp" event={"ID":"5044f3da-a9aa-4f6e-b598-3b5e963f8731","Type":"ContainerDied","Data":"1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185"} Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.230338 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.237034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wmdb5" event={"ID":"3a516b45-9a5d-4210-82b3-b07e7251ffad","Type":"ContainerDied","Data":"1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3"} Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.237081 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.237167 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.247732 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.250089 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.251747 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.258725 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.274922 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325674 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325754 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325840 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " 
pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325917 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325960 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.327299 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.329888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " 
pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.331286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.360805 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.366116 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.371487 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.382228 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.385989 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.390568 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.412557 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.422185 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427688 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427751 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427807 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427878 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.428551 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.450321 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.530305 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.530420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.531467 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.531551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.558479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.579737 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.724164 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.816116 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.828256 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.857835 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" path="/var/lib/kubelet/pods/3a516b45-9a5d-4210-82b3-b07e7251ffad/volumes" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.858640 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.947239 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.948783 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.956934 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.972329 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.986616 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.044957 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045025 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045092 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045122 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045177 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.095905 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147558 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147654 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: 
\"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147689 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147776 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147824 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.148682 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.149469 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.149517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.149555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.150890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.172831 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: W0220 07:07:16.205370 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod403a4371_09f4_4206_8d60_5b970d7e4faf.slice/crio-73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98 WatchSource:0}: Error finding container 73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98: Status 404 returned error can't find the container with id 73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.205723 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-plbtm"]
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.239527 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"]
Feb 20 07:07:16 crc kubenswrapper[5094]: W0220 07:07:16.243660 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4920eee_8485_4faa_892c_893c6466a90c.slice/crio-d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78 WatchSource:0}: Error finding container d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78: Status 404 returned error can't find the container with id d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.253297 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerStarted","Data":"73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98"}
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.257460 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8083-account-create-update-wxrzd" event={"ID":"23a44809-2f91-4dbe-80ed-733390b037d8","Type":"ContainerStarted","Data":"07473375b1389dea5ef061588ec041b7a1dc6c39e615f25c439adbc0f20ff01b"}
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.259759 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qvr99" event={"ID":"5c0f5daa-28f1-412d-8749-5b11f6b8f26d","Type":"ContainerStarted","Data":"98316dffcd57fbd3d1bd86b87cf77b8f71de4c56ee7fbdaff700987546968d91"}
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.261404 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerStarted","Data":"627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334"}
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.261430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerStarted","Data":"6c0583540eb15668853e966ba4c6edb87e4fe6d9a3115b84ce82c03eaf755780"}
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.279249 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-7gn4d" podStartSLOduration=2.279225177 podStartE2EDuration="2.279225177s" podCreationTimestamp="2026-02-20 07:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:16.273078649 +0000 UTC m=+1251.145705360" watchObservedRunningTime="2026-02-20 07:07:16.279225177 +0000 UTC m=+1251.151851888"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.287899 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.372647 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"]
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.383920 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qqgpn"]
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.871213 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lvlr2"
Feb 20 07:07:16 crc kubenswrapper[5094]: E0220 07:07:16.906432 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a44809_2f91_4dbe_80ed_733390b037d8.slice/crio-conmon-652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a44809_2f91_4dbe_80ed_733390b037d8.slice/crio-652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.960438 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"]
Feb 20 07:07:17 crc kubenswrapper[5094]: W0220 07:07:17.020835 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00e11219_ecb3_45ab_8303_265d85ff4c3a.slice/crio-a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4 WatchSource:0}: Error finding container a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4: Status 404 returned error can't find the container with id a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.271103 5094 generic.go:334] "Generic (PLEG): container finished" podID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerID="378b26e1e0650ae576632665d611910465c17369e442435b9765cd97f7bbf4b7" exitCode=0
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.271180 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c179-account-create-update-sst4m" event={"ID":"75a27624-eac7-47c9-9f3b-98604d88fb3a","Type":"ContainerDied","Data":"378b26e1e0650ae576632665d611910465c17369e442435b9765cd97f7bbf4b7"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.271214 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c179-account-create-update-sst4m" event={"ID":"75a27624-eac7-47c9-9f3b-98604d88fb3a","Type":"ContainerStarted","Data":"524ab1f80a7901ac7889c8ffcacb992647f00b7dc710ad8fede539905cd58f26"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.273154 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerID="627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334" exitCode=0
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.273195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerDied","Data":"627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.274592 5094 generic.go:334] "Generic (PLEG): container finished" podID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerID="50b02908599fab0b56ac49b8dfc4de2ac6a680f5927a195974880e894fd05f07" exitCode=0
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.274634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqgpn" event={"ID":"317d32d8-9ad2-4bd1-87f4-745e3157c713","Type":"ContainerDied","Data":"50b02908599fab0b56ac49b8dfc4de2ac6a680f5927a195974880e894fd05f07"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.274650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqgpn" event={"ID":"317d32d8-9ad2-4bd1-87f4-745e3157c713","Type":"ContainerStarted","Data":"33c2a7ece5b891e28740e85e198816b9f0507288fb0d7eaec7b58b5561691817"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.276509 5094 generic.go:334] "Generic (PLEG): container finished" podID="23a44809-2f91-4dbe-80ed-733390b037d8" containerID="652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575" exitCode=0
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.276550 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8083-account-create-update-wxrzd" event={"ID":"23a44809-2f91-4dbe-80ed-733390b037d8","Type":"ContainerDied","Data":"652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.278020 5094 generic.go:334] "Generic (PLEG): container finished" podID="c4920eee-8485-4faa-892c-893c6466a90c" containerID="484f3fb839183cec10038487f86ef12f28aad48e989d27e0f371b4836997c9c1" exitCode=0
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.278054 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-abcd-account-create-update-bwsmr" event={"ID":"c4920eee-8485-4faa-892c-893c6466a90c","Type":"ContainerDied","Data":"484f3fb839183cec10038487f86ef12f28aad48e989d27e0f371b4836997c9c1"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.278068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-abcd-account-create-update-bwsmr" event={"ID":"c4920eee-8485-4faa-892c-893c6466a90c","Type":"ContainerStarted","Data":"d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.281216 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" containerID="cef36671b09afd9d82ea9087a220ba378848b8caf63bdb34c5ff82372929ee6f" exitCode=0
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.281260 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qvr99" event={"ID":"5c0f5daa-28f1-412d-8749-5b11f6b8f26d","Type":"ContainerDied","Data":"cef36671b09afd9d82ea9087a220ba378848b8caf63bdb34c5ff82372929ee6f"}
Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.283439 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wsgcd" event={"ID":"00e11219-ecb3-45ab-8303-265d85ff4c3a","Type":"ContainerStarted","Data":"a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4"}
Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.295980 5094 generic.go:334] "Generic (PLEG): container finished" podID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerID="d9a07c98406e23d72c5a2bc3d04e8964b30bc89dab757f6e64abbd3de62c1272" exitCode=0
Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.296092 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wsgcd" event={"ID":"00e11219-ecb3-45ab-8303-265d85ff4c3a","Type":"ContainerDied","Data":"d9a07c98406e23d72c5a2bc3d04e8964b30bc89dab757f6e64abbd3de62c1272"}
Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.799902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.852036 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"]
Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.852292 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" containerID="cri-o://a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d" gracePeriod=10
Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.906172 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused"
Feb 20 07:07:19 crc kubenswrapper[5094]: I0220 07:07:19.309127 5094 generic.go:334] "Generic (PLEG): container finished" podID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerID="a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d" exitCode=0
Feb 20 07:07:19 crc kubenswrapper[5094]: I0220 07:07:19.309192 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerDied","Data":"a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d"}
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.339441 5094 generic.go:334] "Generic (PLEG): container finished" podID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerID="faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466" exitCode=0
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.339528 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerDied","Data":"faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466"}
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.348011 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qvr99" event={"ID":"5c0f5daa-28f1-412d-8749-5b11f6b8f26d","Type":"ContainerDied","Data":"98316dffcd57fbd3d1bd86b87cf77b8f71de4c56ee7fbdaff700987546968d91"}
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.348089 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98316dffcd57fbd3d1bd86b87cf77b8f71de4c56ee7fbdaff700987546968d91"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.622762 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qvr99"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.630575 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gn4d"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.636326 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqgpn"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.654110 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.690345 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702313 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"c4920eee-8485-4faa-892c-893c6466a90c\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"c4920eee-8485-4faa-892c-893c6466a90c\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"317d32d8-9ad2-4bd1-87f4-745e3157c713\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"317d32d8-9ad2-4bd1-87f4-745e3157c713\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702489 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"3d59abb8-e7c7-404f-8f03-13d2167bea54\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702684 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"3d59abb8-e7c7-404f-8f03-13d2167bea54\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.703643 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4920eee-8485-4faa-892c-893c6466a90c" (UID: "c4920eee-8485-4faa-892c-893c6466a90c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.703966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c0f5daa-28f1-412d-8749-5b11f6b8f26d" (UID: "5c0f5daa-28f1-412d-8749-5b11f6b8f26d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.704093 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.704546 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "317d32d8-9ad2-4bd1-87f4-745e3157c713" (UID: "317d32d8-9ad2-4bd1-87f4-745e3157c713"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.704673 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d59abb8-e7c7-404f-8f03-13d2167bea54" (UID: "3d59abb8-e7c7-404f-8f03-13d2167bea54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.709912 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs" (OuterVolumeSpecName: "kube-api-access-4ztfs") pod "c4920eee-8485-4faa-892c-893c6466a90c" (UID: "c4920eee-8485-4faa-892c-893c6466a90c"). InnerVolumeSpecName "kube-api-access-4ztfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.709953 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt" (OuterVolumeSpecName: "kube-api-access-gvlpt") pod "3d59abb8-e7c7-404f-8f03-13d2167bea54" (UID: "3d59abb8-e7c7-404f-8f03-13d2167bea54"). InnerVolumeSpecName "kube-api-access-gvlpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.711184 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.712346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l" (OuterVolumeSpecName: "kube-api-access-rnz6l") pod "317d32d8-9ad2-4bd1-87f4-745e3157c713" (UID: "317d32d8-9ad2-4bd1-87f4-745e3157c713"). InnerVolumeSpecName "kube-api-access-rnz6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.722131 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx" (OuterVolumeSpecName: "kube-api-access-zxrcx") pod "5c0f5daa-28f1-412d-8749-5b11f6b8f26d" (UID: "5c0f5daa-28f1-412d-8749-5b11f6b8f26d"). InnerVolumeSpecName "kube-api-access-zxrcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.735068 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.740361 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.804952 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805001 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805077 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"75a27624-eac7-47c9-9f3b-98604d88fb3a\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805144 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805171 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805225 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805259 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805288 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805288 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run" (OuterVolumeSpecName: "var-run") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805334 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805404 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"23a44809-2f91-4dbe-80ed-733390b037d8\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805452 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"23a44809-2f91-4dbe-80ed-733390b037d8\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805480 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"75a27624-eac7-47c9-9f3b-98604d88fb3a\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805548 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") "
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805850 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805927 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805941 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805951 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805963 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805973 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805985 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805994 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.806004 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.806846 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23a44809-2f91-4dbe-80ed-733390b037d8" (UID: "23a44809-2f91-4dbe-80ed-733390b037d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.806913 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.806999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.807825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts" (OuterVolumeSpecName: "scripts") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.808408 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75a27624-eac7-47c9-9f3b-98604d88fb3a" (UID: "75a27624-eac7-47c9-9f3b-98604d88fb3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.810713 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2" (OuterVolumeSpecName: "kube-api-access-vrtl2") pod "23a44809-2f91-4dbe-80ed-733390b037d8" (UID: "23a44809-2f91-4dbe-80ed-733390b037d8"). InnerVolumeSpecName "kube-api-access-vrtl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.813535 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9" (OuterVolumeSpecName: "kube-api-access-mzbc9") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "kube-api-access-mzbc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.813768 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n" (OuterVolumeSpecName: "kube-api-access-99x5n") pod "75a27624-eac7-47c9-9f3b-98604d88fb3a" (UID: "75a27624-eac7-47c9-9f3b-98604d88fb3a"). InnerVolumeSpecName "kube-api-access-99x5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.816523 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw" (OuterVolumeSpecName: "kube-api-access-jb7kw") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "kube-api-access-jb7kw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.858475 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.859683 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.884154 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config" (OuterVolumeSpecName: "config") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.890117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908663 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908693 5094 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908728 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908745 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908759 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908775 5094 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908790 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908806 5094 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908819 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908831 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908844 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908857 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908870 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 
crc kubenswrapper[5094]: I0220 07:07:21.908889 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.386770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerStarted","Data":"f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.392026 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8083-account-create-update-wxrzd" event={"ID":"23a44809-2f91-4dbe-80ed-733390b037d8","Type":"ContainerDied","Data":"07473375b1389dea5ef061588ec041b7a1dc6c39e615f25c439adbc0f20ff01b"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.392092 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07473375b1389dea5ef061588ec041b7a1dc6c39e615f25c439adbc0f20ff01b" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.395871 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.415193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-abcd-account-create-update-bwsmr" event={"ID":"c4920eee-8485-4faa-892c-893c6466a90c","Type":"ContainerDied","Data":"d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.415262 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.415413 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.418377 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-plbtm" podStartSLOduration=2.179566529 podStartE2EDuration="7.418350324s" podCreationTimestamp="2026-02-20 07:07:15 +0000 UTC" firstStartedPulling="2026-02-20 07:07:16.208940953 +0000 UTC m=+1251.081567664" lastFinishedPulling="2026-02-20 07:07:21.447724758 +0000 UTC m=+1256.320351459" observedRunningTime="2026-02-20 07:07:22.415917016 +0000 UTC m=+1257.288543737" watchObservedRunningTime="2026-02-20 07:07:22.418350324 +0000 UTC m=+1257.290977045" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.424191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerDied","Data":"4f29f1584725a78d33c79c69353a3c195206f10f8b9ed911fdd59866eb9d81be"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.424226 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.424298 5094 scope.go:117] "RemoveContainer" containerID="a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.438756 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wsgcd" event={"ID":"00e11219-ecb3-45ab-8303-265d85ff4c3a","Type":"ContainerDied","Data":"a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.438799 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.438888 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.441330 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c179-account-create-update-sst4m" event={"ID":"75a27624-eac7-47c9-9f3b-98604d88fb3a","Type":"ContainerDied","Data":"524ab1f80a7901ac7889c8ffcacb992647f00b7dc710ad8fede539905cd58f26"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.441350 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524ab1f80a7901ac7889c8ffcacb992647f00b7dc710ad8fede539905cd58f26" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.441395 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.444134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerDied","Data":"6c0583540eb15668853e966ba4c6edb87e4fe6d9a3115b84ce82c03eaf755780"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.444159 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0583540eb15668853e966ba4c6edb87e4fe6d9a3115b84ce82c03eaf755780" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.444302 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447830 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqgpn" event={"ID":"317d32d8-9ad2-4bd1-87f4-745e3157c713","Type":"ContainerDied","Data":"33c2a7ece5b891e28740e85e198816b9f0507288fb0d7eaec7b58b5561691817"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447974 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33c2a7ece5b891e28740e85e198816b9f0507288fb0d7eaec7b58b5561691817" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447940 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447878 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.470980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.478462 5094 scope.go:117] "RemoveContainer" containerID="d125587006c31c65f1eeb83ce252e5afe7d019516fe47b88f48976b518f4ec0b" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.479236 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.826110 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.842237 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.935124 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-d27ft" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.036642 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.036862 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.037147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.037267 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.050947 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.051061 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5" (OuterVolumeSpecName: "kube-api-access-f2sn5") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "kube-api-access-f2sn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.069007 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.095183 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data" (OuterVolumeSpecName: "config-data") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140595 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140632 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140646 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140661 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.460819 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-d27ft" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.460895 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerDied","Data":"178c70ca4808580e5184d1f4d0b6c895bc6f3ba07b300a63f8cb105cddd95d66"} Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.460974 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="178c70ca4808580e5184d1f4d0b6c895bc6f3ba07b300a63f8cb105cddd95d66" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.859854 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" path="/var/lib/kubelet/pods/00e11219-ecb3-45ab-8303-265d85ff4c3a/volumes" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.861724 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" path="/var/lib/kubelet/pods/53d83e89-d39a-4ed6-ab65-02820d089bec/volumes" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.904959 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905345 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905368 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905382 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905392 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905402 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4920eee-8485-4faa-892c-893c6466a90c" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905408 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4920eee-8485-4faa-892c-893c6466a90c" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905431 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905437 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905451 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="init" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905458 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="init" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905470 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerName="ovn-config" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905477 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerName="ovn-config" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905486 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905493 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905508 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905514 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905524 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905530 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905538 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerName="glance-db-sync" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905544 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerName="glance-db-sync" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905687 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905715 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4920eee-8485-4faa-892c-893c6466a90c" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905727 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc 
kubenswrapper[5094]: I0220 07:07:23.905735 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905744 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerName="glance-db-sync" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905752 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerName="ovn-config" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905773 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905786 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.906687 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.925263 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.960956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961161 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961184 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961205 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063176 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063248 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063275 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.064519 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.064677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.064929 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.065098 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.065106 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.093159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.227055 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.736378 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:24 crc kubenswrapper[5094]: W0220 07:07:24.747886 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod887446b0_f238_4ff4_82dd_a903299a0105.slice/crio-d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f WatchSource:0}: Error finding container d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f: Status 404 returned error can't find the container with id d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.484164 5094 generic.go:334] "Generic (PLEG): container finished" podID="887446b0-f238-4ff4-82dd-a903299a0105" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2" exitCode=0 Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.484315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerDied","Data":"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2"} Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.484794 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerStarted","Data":"d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f"} Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.487238 5094 generic.go:334] "Generic (PLEG): container finished" podID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerID="f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d" exitCode=0 Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.487285 5094 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerDied","Data":"f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d"} Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.499303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerStarted","Data":"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"} Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.537582 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" podStartSLOduration=3.537546655 podStartE2EDuration="3.537546655s" podCreationTimestamp="2026-02-20 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:26.525665641 +0000 UTC m=+1261.398292352" watchObservedRunningTime="2026-02-20 07:07:26.537546655 +0000 UTC m=+1261.410173366" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.879065 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.923380 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"403a4371-09f4-4206-8d60-5b970d7e4faf\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.923583 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"403a4371-09f4-4206-8d60-5b970d7e4faf\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.923825 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"403a4371-09f4-4206-8d60-5b970d7e4faf\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.931116 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh" (OuterVolumeSpecName: "kube-api-access-d67lh") pod "403a4371-09f4-4206-8d60-5b970d7e4faf" (UID: "403a4371-09f4-4206-8d60-5b970d7e4faf"). InnerVolumeSpecName "kube-api-access-d67lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.977039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data" (OuterVolumeSpecName: "config-data") pod "403a4371-09f4-4206-8d60-5b970d7e4faf" (UID: "403a4371-09f4-4206-8d60-5b970d7e4faf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.984675 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "403a4371-09f4-4206-8d60-5b970d7e4faf" (UID: "403a4371-09f4-4206-8d60-5b970d7e4faf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.031558 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.031605 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.031616 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.513580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerDied","Data":"73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98"} Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.514220 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.514277 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.513662 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.858430 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.905967 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:27 crc kubenswrapper[5094]: E0220 07:07:27.906405 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerName="keystone-db-sync" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.906425 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerName="keystone-db-sync" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.906623 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerName="keystone-db-sync" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.907525 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.952636 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.953885 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954584 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954825 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: 
\"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.955036 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.964887 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.965110 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.965257 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.965419 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.971173 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.979466 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.000628 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " 
pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057349 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057614 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057677 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057842 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058176 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " 
pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058241 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058274 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058334 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058355 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058673 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " 
pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.059728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.060309 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.060930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc 
kubenswrapper[5094]: I0220 07:07:28.061119 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.114678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161325 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161376 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161498 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161546 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.195219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.203627 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.207000 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g6b\" (UniqueName: 
\"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.220524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.221175 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.222840 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.233314 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.248057 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.249434 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.257273 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.257536 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.257662 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f4gh4" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.283622 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.285031 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.305722 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.310488 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.311985 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zk895" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.312146 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.333258 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.371695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.371768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.371817 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.381859 5094 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.382335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.382407 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382496 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382528 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382552 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"cinder-db-sync-t7hr7\" (UID: 
\"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.454747 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jsvf2"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.456494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.459832 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.463689 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fq797" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.463926 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484461 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: 
\"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484500 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484561 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484583 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: 
I0220 07:07:28.484650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.485450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.495257 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jsvf2"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.497526 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.504307 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.522432 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.523333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.523927 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.524117 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.527562 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.528377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf24d\" (UniqueName: 
\"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.542534 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.544416 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.557074 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-z4bpk" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.557445 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.585933 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586531 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586554 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod 
\"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"barbican-db-sync-fvmwf\" (UID: 
\"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.594263 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.594882 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.609975 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.621982 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.627348 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.632854 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.633142 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.678224 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.693175 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694356 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694454 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695371 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695947 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696091 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696263 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"ceilometer-0\" 
(UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696439 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696522 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696617 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.697321 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.698813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.703802 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.707829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.714163 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.717213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.719131 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.719537 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.722223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.732041 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.745093 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.797060 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jsvf2" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " 
pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801740 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801767 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801788 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.802211 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.803833 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.806819 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.807949 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " 
pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.808043 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.810273 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.810583 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.826587 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.885673 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.905353 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.905419 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.905492 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.907819 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.907846 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc 
kubenswrapper[5094]: I0220 07:07:28.907983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.909747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.910018 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.910718 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.910727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.911053 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.925315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.976934 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.026413 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.180957 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.183872 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.190175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.190446 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.191433 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.191478 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9q5bq" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.194322 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317345 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317426 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317460 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317525 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317607 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.358430 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.360434 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.370167 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.370556 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.382534 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422443 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422822 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422924 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425079 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425246 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425354 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425461 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425686 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426018 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426080 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426107 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426215 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426783 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426841 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.431054 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.431722 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.432735 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.443270 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x4w6\" (UniqueName: 
\"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.447318 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.458475 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.511386 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528206 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528321 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528385 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528488 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 
20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.529265 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.530103 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.530254 5094 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.533520 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.534141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.534463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.536612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.537432 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 
07:07:29 crc kubenswrapper[5094]: W0220 07:07:29.548301 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a92386_f07a_4845_9a5d_231a4c498d3f.slice/crio-268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb WatchSource:0}: Error finding container 268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb: Status 404 returned error can't find the container with id 268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.552406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.601910 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.613579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" event={"ID":"e1a92386-f07a-4845-9a5d-231a4c498d3f","Type":"ContainerStarted","Data":"268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb"} Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.613906 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns" containerID="cri-o://60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" gracePeriod=10 Feb 20 07:07:29 crc 
kubenswrapper[5094]: I0220 07:07:29.686657 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.861878 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.888419 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.906726 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.915105 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:07:29 crc kubenswrapper[5094]: W0220 07:07:29.921405 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffc4926a_ede6_4124_ac91_c9912ffa8a23.slice/crio-003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56 WatchSource:0}: Error finding container 003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56: Status 404 returned error can't find the container with id 003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56 Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.929029 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.956744 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jsvf2"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.030649 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.209561 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:30 crc 
kubenswrapper[5094]: I0220 07:07:30.497375 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.543634 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.564386 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.564932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565088 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565219 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565451 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.574977 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684" (OuterVolumeSpecName: "kube-api-access-vl684") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "kube-api-access-vl684". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.671350 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.676055 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerStarted","Data":"49eddfb76f5127d9790e86db4511eac3b1b4ddb83b95cfa7b7d41b4e2cbe2a5f"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680339 5094 generic.go:334] "Generic (PLEG): container finished" podID="887446b0-f238-4ff4-82dd-a903299a0105" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" exitCode=0 Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680456 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerDied","Data":"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680529 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerDied","Data":"d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680552 5094 scope.go:117] "RemoveContainer" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.686519 5094 generic.go:334] "Generic (PLEG): container finished" podID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerID="fdd5dbb72ced24f8fe5963a32bf7fcb2da6a991b9087dbbc5d4a24d4daaa0e56" exitCode=0 Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.686597 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" event={"ID":"e1a92386-f07a-4845-9a5d-231a4c498d3f","Type":"ContainerDied","Data":"fdd5dbb72ced24f8fe5963a32bf7fcb2da6a991b9087dbbc5d4a24d4daaa0e56"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.690451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerStarted","Data":"f1fff81d621e7a98189b82c44089d8bbf94b1c86c3b0de4e3b7aae8e0f0b3931"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.692361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" 
event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerStarted","Data":"db75b8edda289ee932be6a52e71a112b82ffd485c1dcf3a3df8f8b8577f5dd7d"}
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.694050 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerStarted","Data":"7bade8250a47c4de537369bc3b59be47a7a82eab789fb96f42d111f476483273"}
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.696423 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerStarted","Data":"21f4e9dd4335713cc70e8a920496faa98eb7767c86615d81c2e49ffa01bf7858"}
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.699832 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"7d9f8b3046d52c477cb1f1e73376c263067dfb9d891c04aea7389d4a8f986dbe"}
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.706084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerStarted","Data":"6e88acd54dbbf562acdafbd54ac1a987deab1124d294a1cf214bd932d6b05497"}
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.710274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerStarted","Data":"1b8f196598eb71bb63b0c9673a673bcdbc971ba9e6d708b6476a724a6e47b813"}
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.740546 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerStarted","Data":"003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56"}
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.748058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.800168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.801541 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config" (OuterVolumeSpecName: "config") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.809571 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.809622 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.809635 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.820443 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.835621 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.837408 5094 scope.go:117] "RemoveContainer" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2"
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.901132 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.913144 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.915653 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.961032 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.973955 5094 scope.go:117] "RemoveContainer" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"
Feb 20 07:07:30 crc kubenswrapper[5094]: E0220 07:07:30.975871 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6\": container with ID starting with 60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6 not found: ID does not exist" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.975966 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"} err="failed to get container status \"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6\": rpc error: code = NotFound desc = could not find container \"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6\": container with ID starting with 60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6 not found: ID does not exist"
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.975998 5094 scope.go:117] "RemoveContainer" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2"
Feb 20 07:07:30 crc kubenswrapper[5094]: E0220 07:07:30.991582 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2\": container with ID starting with febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2 not found: ID does not exist" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2"
Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.991631 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2"} err="failed to get container status \"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2\": rpc error: code = NotFound desc = could not find container \"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2\": container with ID starting with febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2 not found: ID does not exist"
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.015196 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.109588 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"]
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.146202 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"]
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.356680 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz"
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423312 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") "
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423351 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") "
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423390 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") "
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423414 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") "
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423481 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") "
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423551 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") "
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.438315 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl" (OuterVolumeSpecName: "kube-api-access-j5mfl") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "kube-api-access-j5mfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.452677 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config" (OuterVolumeSpecName: "config") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.461600 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.461759 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.464481 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.464642 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524556 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524594 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524607 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524621 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524654 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524732 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.775014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerStarted","Data":"12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7"}
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.783274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerStarted","Data":"bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553"}
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.789240 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerStarted","Data":"8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e"}
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.793653 5094 generic.go:334] "Generic (PLEG): container finished" podID="53cdb905-b22d-4849-ae24-6baa2838be39" containerID="135ff79dc99efbd620593401c9d7a73c61d546032c68fab2fd094efa64fa4d62" exitCode=0
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.793768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerDied","Data":"135ff79dc99efbd620593401c9d7a73c61d546032c68fab2fd094efa64fa4d62"}
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.828364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b78rr" podStartSLOduration=4.828338176 podStartE2EDuration="4.828338176s" podCreationTimestamp="2026-02-20 07:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:31.800051758 +0000 UTC m=+1266.672678459" watchObservedRunningTime="2026-02-20 07:07:31.828338176 +0000 UTC m=+1266.700964887"
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.842158 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz"
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.859411 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dnm22" podStartSLOduration=3.859384079 podStartE2EDuration="3.859384079s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:31.849368829 +0000 UTC m=+1266.721995550" watchObservedRunningTime="2026-02-20 07:07:31.859384079 +0000 UTC m=+1266.732010790"
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.894754 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887446b0-f238-4ff4-82dd-a903299a0105" path="/var/lib/kubelet/pods/887446b0-f238-4ff4-82dd-a903299a0105/volumes"
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.895579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" event={"ID":"e1a92386-f07a-4845-9a5d-231a4c498d3f","Type":"ContainerDied","Data":"268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb"}
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.895635 5094 scope.go:117] "RemoveContainer" containerID="fdd5dbb72ced24f8fe5963a32bf7fcb2da6a991b9087dbbc5d4a24d4daaa0e56"
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.967695 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"]
Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.975138 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"]
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.875509 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerStarted","Data":"2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d"}
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.884797 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerStarted","Data":"c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051"}
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.885019 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log" containerID="cri-o://8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e" gracePeriod=30
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.885716 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd" containerID="cri-o://c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051" gracePeriod=30
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.893060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerStarted","Data":"886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51"}
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.893101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.909528 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.909503159 podStartE2EDuration="4.909503159s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:32.903941205 +0000 UTC m=+1267.776567916" watchObservedRunningTime="2026-02-20 07:07:32.909503159 +0000 UTC m=+1267.782129870"
Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.943398 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" podStartSLOduration=4.943373159 podStartE2EDuration="4.943373159s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:32.933661617 +0000 UTC m=+1267.806288328" watchObservedRunningTime="2026-02-20 07:07:32.943373159 +0000 UTC m=+1267.815999870"
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.853904 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" path="/var/lib/kubelet/pods/e1a92386-f07a-4845-9a5d-231a4c498d3f/volumes"
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.921834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerDied","Data":"c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051"}
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.921690 5094 generic.go:334] "Generic (PLEG): container finished" podID="090c8378-96fa-4223-8b6d-b98fa179046a" containerID="c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051" exitCode=0
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.922022 5094 generic.go:334] "Generic (PLEG): container finished" podID="090c8378-96fa-4223-8b6d-b98fa179046a" containerID="8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e" exitCode=143
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.922113 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerDied","Data":"8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e"}
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.939732 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerStarted","Data":"70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8"}
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.939963 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log" containerID="cri-o://2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d" gracePeriod=30
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.940112 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd" containerID="cri-o://70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8" gracePeriod=30
Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.975768 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.975742975 podStartE2EDuration="5.975742975s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:33.965098439 +0000 UTC m=+1268.837725150" watchObservedRunningTime="2026-02-20 07:07:33.975742975 +0000 UTC m=+1268.848369686"
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.106519 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.106614 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.954586 5094 generic.go:334] "Generic (PLEG): container finished" podID="4cbff870-795c-4622-90e7-e06559b6d884" containerID="70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8" exitCode=0
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.955092 5094 generic.go:334] "Generic (PLEG): container finished" podID="4cbff870-795c-4622-90e7-e06559b6d884" containerID="2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d" exitCode=143
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.954669 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerDied","Data":"70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8"}
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.955164 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerDied","Data":"2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d"}
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.957928 5094 generic.go:334] "Generic (PLEG): container finished" podID="d63a9457-c57a-4979-bd28-ee982250b13c" containerID="12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7" exitCode=0
Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.957968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerDied","Data":"12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7"}
Feb 20 07:07:38 crc kubenswrapper[5094]: I0220 07:07:38.910617 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b78rr"
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.002854 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerDied","Data":"1b8f196598eb71bb63b0c9673a673bcdbc971ba9e6d708b6476a724a6e47b813"}
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.002915 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b8f196598eb71bb63b0c9673a673bcdbc971ba9e6d708b6476a724a6e47b813"
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.002928 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b78rr"
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.028930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068520 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") "
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068670 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") "
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068745 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") "
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068861 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") "
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.069024 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") "
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.069173 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") "
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.080121 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.102434 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.128697 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts" (OuterVolumeSpecName: "scripts") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.132128 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data" (OuterVolumeSpecName: "config-data") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.144156 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b" (OuterVolumeSpecName: "kube-api-access-25g6b") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "kube-api-access-25g6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.150143 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.150410 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" containerID="cri-o://0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6" gracePeriod=10
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.160394 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171322 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171364 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171378 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171387 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171396 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171403 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.040426 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerID="0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6" exitCode=0
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.040493 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerDied","Data":"0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6"}
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.080586 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b78rr"]
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.087078 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b78rr"]
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.187202 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xnj8t"]
Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.187840 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" containerName="keystone-bootstrap"
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.187862 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" containerName="keystone-bootstrap"
Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.188042 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns"
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188053 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns"
Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.188094 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="init"
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188106 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="init"
Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.188123 5094 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188131 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188344 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" containerName="keystone-bootstrap" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188364 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188374 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.192077 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.196882 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.196983 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.196885 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.197377 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.197521 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.218504 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"] 
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301677 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301781 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301812 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301885 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc 
kubenswrapper[5094]: I0220 07:07:40.301935 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.403773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.403934 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.403988 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.404014 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.404082 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.404105 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415151 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415146 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415241 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415429 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod 
\"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.416137 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.428122 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.517444 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:41 crc kubenswrapper[5094]: I0220 07:07:41.856306 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" path="/var/lib/kubelet/pods/d63a9457-c57a-4979-bd28-ee982250b13c/volumes" Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.962608 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.982596 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995378 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995529 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995642 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995684 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995751 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995870 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: 
\"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995907 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995961 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996081 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996188 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996240 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996279 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996435 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.003586 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs" (OuterVolumeSpecName: "logs") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.007353 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.007378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs" (OuterVolumeSpecName: "logs") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.009554 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.010475 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.028516 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts" (OuterVolumeSpecName: "scripts") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.031181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts" (OuterVolumeSpecName: "scripts") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.035515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.040242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6" (OuterVolumeSpecName: "kube-api-access-8x4w6") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "kube-api-access-8x4w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.108647 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.122573 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf" (OuterVolumeSpecName: "kube-api-access-zmlgf") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "kube-api-access-zmlgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123089 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123295 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123449 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:43 crc kubenswrapper[5094]: W0220 07:07:43.123505 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4cbff870-795c-4622-90e7-e06559b6d884/volumes/kubernetes.io~projected/kube-api-access-zmlgf Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf" (OuterVolumeSpecName: "kube-api-access-zmlgf") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "kube-api-access-zmlgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: W0220 07:07:43.123737 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/090c8378-96fa-4223-8b6d-b98fa179046a/volumes/kubernetes.io~secret/combined-ca-bundle Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123756 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124549 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124577 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124592 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124606 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124641 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124656 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124795 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124813 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124864 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124925 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124976 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124992 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.132094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerDied","Data":"f1fff81d621e7a98189b82c44089d8bbf94b1c86c3b0de4e3b7aae8e0f0b3931"}
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.132146 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.132283 5094 scope.go:117] "RemoveContainer" containerID="c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.137387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerDied","Data":"49eddfb76f5127d9790e86db4511eac3b1b4ddb83b95cfa7b7d41b4e2cbe2a5f"}
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.137530 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.143270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data" (OuterVolumeSpecName: "config-data") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.147984 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.148069 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data" (OuterVolumeSpecName: "config-data") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.155046 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.163735 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.178974 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227437 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227478 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227496 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227508 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227519 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227531 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.477908 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.487498 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.500861 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.509955 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.528935 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529429 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529448 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd"
Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529466 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529474 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd"
Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529487 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529495 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log"
Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529514 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529520 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.530177 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.530196 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.530214 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.530225 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.531265 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.534276 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9q5bq"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.535426 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.535537 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.535612 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.552933 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.555022 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.557281 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.557438 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.561835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.570505 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635328 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635378 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635411 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635531 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635560 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635616 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635642 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.736928 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.736985 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737024 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737045 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737064 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737087 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737162 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737223 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737249 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737596 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737682 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737769 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.739920 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740016 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740079 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740111 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.741359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.743286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.750023 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.757118 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.758671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.763672 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.799111 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.841364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.841434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.841980 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842027 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842051 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842089 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842135 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.843853 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.843925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.844331 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.848868 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.853384 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.855995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.856143 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.867515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.872559 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" path="/var/lib/kubelet/pods/090c8378-96fa-4223-8b6d-b98fa179046a/volumes"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.875610 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cbff870-795c-4622-90e7-e06559b6d884" path="/var/lib/kubelet/pods/4cbff870-795c-4622-90e7-e06559b6d884/volumes"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.892210 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.898450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:44 crc kubenswrapper[5094]: I0220 07:07:44.201850 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 07:07:48 crc kubenswrapper[5094]: I0220 07:07:48.799695 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused"
Feb 20 07:07:49 crc kubenswrapper[5094]: I0220 07:07:49.201568 5094 generic.go:334] "Generic (PLEG): container finished" podID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerID="bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553" exitCode=0
Feb 20 07:07:49 crc kubenswrapper[5094]: I0220 07:07:49.201613 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerDied","Data":"bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553"}
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.517915 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.659056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") "
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.659452 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") "
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.659556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") "
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.674244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz" (OuterVolumeSpecName: "kube-api-access-ncgbz") pod "ffc4926a-ede6-4124-ac91-c9912ffa8a23" (UID: "ffc4926a-ede6-4124-ac91-c9912ffa8a23"). InnerVolumeSpecName "kube-api-access-ncgbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.697266 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config" (OuterVolumeSpecName: "config") pod "ffc4926a-ede6-4124-ac91-c9912ffa8a23" (UID: "ffc4926a-ede6-4124-ac91-c9912ffa8a23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.697829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffc4926a-ede6-4124-ac91-c9912ffa8a23" (UID: "ffc4926a-ede6-4124-ac91-c9912ffa8a23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.762983 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.763033 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.763051 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.241786 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerDied","Data":"003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56"}
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.242250 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.241884 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.818136 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"]
Feb 20 07:07:53 crc kubenswrapper[5094]: E0220 07:07:53.818558 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerName="neutron-db-sync"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.818571 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerName="neutron-db-sync"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.818789 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerName="neutron-db-sync"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.819938 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.856901 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"]
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.875815 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"]
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.877417 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.884936 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.885232 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.885926 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zk895"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.886069 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.895028 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"]
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990839 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990864 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod
\"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990888 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990937 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990959 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991217 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991401 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991497 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991533 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991716 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094177 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094298 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094336 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") 
pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094474 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094500 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094530 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094558 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.095576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " 
pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.096161 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.097746 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.098010 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.098177 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.105633 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.105626 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.108685 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.115491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.116622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.118515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.158413 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.195162 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.195464 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeM
ount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qf24d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-t7hr7_openstack(15583b83-ce22-4b0b-9566-0e056b07c0d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.197363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-t7hr7" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.211529 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.236960 5094 scope.go:117] "RemoveContainer" containerID="8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.262101 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerDied","Data":"a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa"} Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.262158 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa" Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.266775 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-t7hr7" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.360964 5094 scope.go:117] "RemoveContainer" containerID="70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.376931 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503540 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503689 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503746 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503782 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503795 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.533439 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5" (OuterVolumeSpecName: "kube-api-access-jzck5") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "kube-api-access-jzck5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.574893 5094 scope.go:117] "RemoveContainer" containerID="2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.575008 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.606307 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.606344 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.677440 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.708906 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.759030 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.762916 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"] Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.791666 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.810828 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.810882 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.816313 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config" (OuterVolumeSpecName: "config") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.914963 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.916937 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.042277 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"] Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.175386 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.332033 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerStarted","Data":"f4c855da6dfadfbfaa6485b438038ef55afdb9b012e53b5119257649b545af32"} Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.365667 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerStarted","Data":"d752ec9568b97c7a9a1e0ea7c10ce0973a08508f8beb84b53fb4fc5635c2706f"} Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.385549 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerStarted","Data":"c37272ac6f9e924740c6d7aa103c2e64f6efab2b196866681821b669409f2ee4"} Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.395134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" 
event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerStarted","Data":"8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.401940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.418154 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerStarted","Data":"d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.418218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerStarted","Data":"3471a54fd91f31962c620ac3ac75855c348f402dfa6489d764abc8274de2fc01"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.428163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerStarted","Data":"8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.432370 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jsvf2" podStartSLOduration=3.179852414 podStartE2EDuration="27.432347303s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.949929809 +0000 UTC m=+1264.822556520" lastFinishedPulling="2026-02-20 07:07:54.202424698 +0000 UTC m=+1289.075051409" observedRunningTime="2026-02-20 07:07:55.419552867 +0000 UTC m=+1290.292179578" watchObservedRunningTime="2026-02-20 07:07:55.432347303 +0000 UTC m=+1290.304974014"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.434825 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.452534 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xnj8t" podStartSLOduration=15.452513347 podStartE2EDuration="15.452513347s" podCreationTimestamp="2026-02-20 07:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:55.447934177 +0000 UTC m=+1290.320560888" watchObservedRunningTime="2026-02-20 07:07:55.452513347 +0000 UTC m=+1290.325140058"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.488608 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fvmwf" podStartSLOduration=3.217929856 podStartE2EDuration="27.48858445s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.943872004 +0000 UTC m=+1264.816498715" lastFinishedPulling="2026-02-20 07:07:54.214526598 +0000 UTC m=+1289.087153309" observedRunningTime="2026-02-20 07:07:55.478052118 +0000 UTC m=+1290.350678829" watchObservedRunningTime="2026-02-20 07:07:55.48858445 +0000 UTC m=+1290.361211161"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.514314 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.521584 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.862265 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" path="/var/lib/kubelet/pods/aa0f7da7-a9bd-4b03-b256-d05ba9323e70/volumes"
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.159981 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"]
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.478094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerStarted","Data":"7216118bc6764d988378bcc95f70afbc24e44597d2724806d33cbd64bb7f1c0b"}
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.495511 5094 generic.go:334] "Generic (PLEG): container finished" podID="2c274cd0-4938-48fa-8534-409a3070299f" containerID="ce1978c29ea807736776e1aab75d72153c9ee3dd68aa8d95e4d850a505683bff" exitCode=0
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.495596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerDied","Data":"ce1978c29ea807736776e1aab75d72153c9ee3dd68aa8d95e4d850a505683bff"}
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.503476 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerStarted","Data":"3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7"}
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.512768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerStarted","Data":"d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea"}
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.349825 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"]
Feb 20 07:07:57 crc kubenswrapper[5094]: E0220 07:07:57.351104 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="init"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.351120 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="init"
Feb 20 07:07:57 crc kubenswrapper[5094]: E0220 07:07:57.351132 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.351138 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.351330 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.360593 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.363475 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"]
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.367195 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.367560 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484277 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484339 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484400 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484489 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484529 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484580 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.522986 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerStarted","Data":"00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988"}
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.525076 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerStarted","Data":"763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859"}
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.534065 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerStarted","Data":"f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18"}
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.545003 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerStarted","Data":"419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a"}
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.545115 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerStarted","Data":"eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931"}
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.547047 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.578525 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.578496202 podStartE2EDuration="14.578496202s" podCreationTimestamp="2026-02-20 07:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:57.561238709 +0000 UTC m=+1292.433865440" watchObservedRunningTime="2026-02-20 07:07:57.578496202 +0000 UTC m=+1292.451122913"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.587465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.587533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.590067 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.593376 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.595585 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.595625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.595811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.598084 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.603108 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.605291 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.610411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.612114 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.612087887 podStartE2EDuration="14.612087887s" podCreationTimestamp="2026-02-20 07:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:57.586383362 +0000 UTC m=+1292.459010073" watchObservedRunningTime="2026-02-20 07:07:57.612087887 +0000 UTC m=+1292.484714598"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.613203 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.613474 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.623558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.631454 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5ddb8575b6-4wznv" podStartSLOduration=4.63142755 podStartE2EDuration="4.63142755s" podCreationTimestamp="2026-02-20 07:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:57.627528727 +0000 UTC m=+1292.500155448" watchObservedRunningTime="2026-02-20 07:07:57.63142755 +0000 UTC m=+1292.504054261"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.736071 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.357166 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"]
Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.562361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d"}
Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.570547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerStarted","Data":"b9098b5dbcb409320185fc3b697229991f7ab044a2834551fda65cf47f38a5d4"}
Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.570887 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.605401 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" podStartSLOduration=5.605373466 podStartE2EDuration="5.605373466s" podCreationTimestamp="2026-02-20 07:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:58.588799439 +0000 UTC m=+1293.461426150" watchObservedRunningTime="2026-02-20 07:07:58.605373466 +0000 UTC m=+1293.478000187"
Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.803651 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout"
Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.584215 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerStarted","Data":"be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5"}
Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.584669 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.584683 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerStarted","Data":"a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017"}
Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.586170 5094 generic.go:334] "Generic (PLEG): container finished" podID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerID="8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc" exitCode=0
Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.586209 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerDied","Data":"8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc"}
Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.604293 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d8645fb77-xprwl" podStartSLOduration=2.6042665879999998 podStartE2EDuration="2.604266588s" podCreationTimestamp="2026-02-20 07:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:59.601905401 +0000 UTC m=+1294.474532112" watchObservedRunningTime="2026-02-20 07:07:59.604266588 +0000 UTC m=+1294.476893299"
Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.601141 5094 generic.go:334] "Generic (PLEG): container finished" podID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" containerID="d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170" exitCode=0
Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.601234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerDied","Data":"d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170"}
Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.604801 5094 generic.go:334] "Generic (PLEG): container finished" podID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerID="8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8" exitCode=0
Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.604984 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerDied","Data":"8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8"}
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.357541 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.366288 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.371592 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531328 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531400 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531485 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531624 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531735 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531811 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531885 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.532042 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.532063 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.532111 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") "
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.533228 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs" (OuterVolumeSpecName: "logs") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.538162 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz" (OuterVolumeSpecName: "kube-api-access-fqrpz") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "kube-api-access-fqrpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b" (OuterVolumeSpecName: "kube-api-access-bft6b") pod "d6e6aec3-87a9-4f8a-b640-313ab241ec6f" (UID: "d6e6aec3-87a9-4f8a-b640-313ab241ec6f"). InnerVolumeSpecName "kube-api-access-bft6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540560 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts" (OuterVolumeSpecName: "scripts") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540592 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d6e6aec3-87a9-4f8a-b640-313ab241ec6f" (UID: "d6e6aec3-87a9-4f8a-b640-313ab241ec6f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.541021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts" (OuterVolumeSpecName: "scripts") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.541031 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs" (OuterVolumeSpecName: "kube-api-access-vgbzs") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "kube-api-access-vgbzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.555036 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.571114 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data" (OuterVolumeSpecName: "config-data") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.575981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.579117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data" (OuterVolumeSpecName: "config-data") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.584657 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.606845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6e6aec3-87a9-4f8a-b640-313ab241ec6f" (UID: "d6e6aec3-87a9-4f8a-b640-313ab241ec6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.633541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerDied","Data":"3471a54fd91f31962c620ac3ac75855c348f402dfa6489d764abc8274de2fc01"}
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.633595 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3471a54fd91f31962c620ac3ac75855c348f402dfa6489d764abc8274de2fc01"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.634757 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637928 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637962 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637975 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637988 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638003 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638015 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638027 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638051 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638062 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638076 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638091 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638103 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638114 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerDied","Data":"6e88acd54dbbf562acdafbd54ac1a987deab1124d294a1cf214bd932d6b05497"}
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638829 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e88acd54dbbf562acdafbd54ac1a987deab1124d294a1cf214bd932d6b05497"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638903 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.648932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerDied","Data":"21f4e9dd4335713cc70e8a920496faa98eb7767c86615d81c2e49ffa01bf7858"}
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.648970 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f4e9dd4335713cc70e8a920496faa98eb7767c86615d81c2e49ffa01bf7858"
Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.649043 5094 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-sync-jsvf2" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728219 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:08:02 crc kubenswrapper[5094]: E0220 07:08:02.728772 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" containerName="keystone-bootstrap" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728790 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" containerName="keystone-bootstrap" Feb 20 07:08:02 crc kubenswrapper[5094]: E0220 07:08:02.728832 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerName="barbican-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728844 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerName="barbican-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: E0220 07:08:02.728854 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerName="placement-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728860 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerName="placement-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729058 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerName="barbican-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729105 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerName="placement-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729130 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" containerName="keystone-bootstrap" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729905 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.732506 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.734689 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735146 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735330 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735478 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735722 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.738318 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.846962 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847030 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847200 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847244 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847425 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949498 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949580 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.971490 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod 
\"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.971762 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.976552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.976933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.977348 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.979653 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 
crc kubenswrapper[5094]: I0220 07:08:02.982483 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.995385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.025725 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.029546 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.063089 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.072859 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.073069 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-z4bpk" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.073152 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.140320 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156499 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156572 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156650 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156688 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156778 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.201799 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.220927 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.221075 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.234632 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258055 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258147 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258204 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") 
pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258225 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258282 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.260534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.272685 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.296838 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.298394 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.298568 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.306583 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.311173 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.315103 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" containerID="cri-o://00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988" gracePeriod=10 Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.322676 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.330286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 
07:08:03.345151 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363270 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363313 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363458 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363671 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472298 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472356 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472388 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472489 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472577 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.476817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.500542 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.507503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.525813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.561917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.562473 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575101 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575206 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575232 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575280 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575304 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.576896 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.600865 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.602190 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.602539 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.621079 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.623238 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.670754 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.672372 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.694139 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.713071 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.737292 5094 generic.go:334] "Generic (PLEG): container finished" podID="2c274cd0-4938-48fa-8534-409a3070299f" containerID="00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988" exitCode=0
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.737384 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerDied","Data":"00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988"}
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.739818 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.752670 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786091 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786614 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786718 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786758 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786806 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786893 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786920 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.787991 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.798104 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.873801 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e"}
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.873913 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.893341 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.893386 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.895722 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906349 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906476 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906569 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906629 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906858 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906924 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907009 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907029 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907046 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907137 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907166 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907233 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907310 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.908292 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.917899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.919219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.920353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.921406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.921864 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.924855 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.957727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.974819 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.975367 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.976063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.992467 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.996780 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.006902 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.007351 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.008205 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.009184 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.012538 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fq797"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016917 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.017059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.019226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.021960 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.026872 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.029691 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.032656 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.040725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.056476 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.057081 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"]
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.079857 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.106998 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.107087 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.107152 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.108327 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.108387 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7" gracePeriod=600
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124669 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124791 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124822 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124903 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.149601 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.170163 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.202600 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.203011 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.233525 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.233601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.233624 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.246342 5094
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251270 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251555 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.258610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.261059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.274555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.274680 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.287999 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.301920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod 
\"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.313867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.347168 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.404632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.497979 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.550368 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.744714 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.840939 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:08:04 crc kubenswrapper[5094]: W0220 07:08:04.860566 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795db1e7_56e9_4ff2_91f1_b3589603d82c.slice/crio-fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027 WatchSource:0}: Error finding container fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027: Status 404 returned error can't find the container with id fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027 Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870218 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870288 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870333 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870515 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.871106 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.871246 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.876759 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.887953 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn" (OuterVolumeSpecName: "kube-api-access-b6zcn") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "kube-api-access-b6zcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.896094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerDied","Data":"f4c855da6dfadfbfaa6485b438038ef55afdb9b012e53b5119257649b545af32"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.896155 5094 scope.go:117] "RemoveContainer" containerID="00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.896361 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.915364 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerStarted","Data":"0e6de3c16ee5f3004f5d74169204f09eb1abb0191c70b76d1a09ff44b2f07e6d"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.981492 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7" exitCode=0 Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.982383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.984827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.985795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.985882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.985908 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerStarted","Data":"775f84fe6aa5df6496627179d444588e580571d5d2f0a7733afea33c0965498e"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.986232 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988049 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988085 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988268 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988634 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.989104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.003269 5094 scope.go:117] "RemoveContainer" containerID="ce1978c29ea807736776e1aab75d72153c9ee3dd68aa8d95e4d850a505683bff" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.026697 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.026822 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.031594 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.091613 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.163449 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.353825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config" (OuterVolumeSpecName: "config") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.396846 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.401723 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.401749 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.424235 5094 scope.go:117] "RemoveContainer" containerID="2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.564546 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.632781 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.891806 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.891848 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"] Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.106597 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerStarted","Data":"0dddba45fa8488a7532914b19dd0a9c232300eec4679520cb1b28149a6920d2e"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.141872 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerStarted","Data":"5c342dad28df34f1d8d92f5a04877af4cb07a57675de3adb965ed98cfe8eaa77"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.169484 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerStarted","Data":"9f163abc1efd183ebd2809a660db1c44ccc0e92e53e74d9ce4dfa48299f86759"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.182666 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerStarted","Data":"b019c482612055cc0918048f8a12a69d96710169b8244b6ca81050099107cecc"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.189913 
5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerStarted","Data":"d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.190365 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.191566 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerStarted","Data":"62bcfccc8f6311f78f2ee50f1468178552942ffa51a8e7c08d763e6569fd3de7"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.214828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerStarted","Data":"fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.262644 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55468cd684-wv6dn" podStartSLOduration=4.26262226 podStartE2EDuration="4.26262226s" podCreationTimestamp="2026-02-20 07:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:06.241637317 +0000 UTC m=+1301.114264028" watchObservedRunningTime="2026-02-20 07:08:06.26262226 +0000 UTC m=+1301.135248961" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.251845 5094 generic.go:334] "Generic (PLEG): container finished" podID="b2a1712c-5268-4203-bab0-c427e96b217b" containerID="548f803cb834d46aa79471e7e46c2cf5bba78c70a36499567ada0307a507be4e" exitCode=0 Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.252582 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerDied","Data":"548f803cb834d46aa79471e7e46c2cf5bba78c70a36499567ada0307a507be4e"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.252629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerStarted","Data":"a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.254240 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.257896 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerStarted","Data":"d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.257955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerStarted","Data":"91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.259044 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.259073 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263112 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerStarted","Data":"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42"} Feb 20 07:08:07 
crc kubenswrapper[5094]: I0220 07:08:07.263184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerStarted","Data":"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263430 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263450 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263428 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263609 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263973 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263990 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.276270 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" podStartSLOduration=4.276250356 podStartE2EDuration="4.276250356s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:07.273254104 +0000 UTC m=+1302.145880815" watchObservedRunningTime="2026-02-20 07:08:07.276250356 +0000 UTC m=+1302.148877067" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.348542 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5b8f9d577d-pgn2k" 
podStartSLOduration=4.3485177870000005 podStartE2EDuration="4.348517787s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:07.316182212 +0000 UTC m=+1302.188808923" watchObservedRunningTime="2026-02-20 07:08:07.348517787 +0000 UTC m=+1302.221144498" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.353742 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9d777b794-txl9q" podStartSLOduration=4.351694483 podStartE2EDuration="4.351694483s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:07.35158937 +0000 UTC m=+1302.224216071" watchObservedRunningTime="2026-02-20 07:08:07.351694483 +0000 UTC m=+1302.224321194" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.571659 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:08:07 crc kubenswrapper[5094]: E0220 07:08:07.572057 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="init" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.572075 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="init" Feb 20 07:08:07 crc kubenswrapper[5094]: E0220 07:08:07.572102 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.572109 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.572314 5094 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.573256 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.575727 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.575729 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.598759 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666083 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666179 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666204 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" 
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666247 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666316 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666342 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666362 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768518 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768607 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768687 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768769 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768797 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.769295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.781744 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.786363 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.800439 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.801436 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.807482 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.808178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.880179 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c274cd0-4938-48fa-8534-409a3070299f" path="/var/lib/kubelet/pods/2c274cd0-4938-48fa-8534-409a3070299f/volumes"
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.896003 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.304286 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.306838 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.349831 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.350961 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.351117 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.357406 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:09 crc kubenswrapper[5094]: I0220 07:08:09.333657 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"]
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.330825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerStarted","Data":"b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.331656 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerStarted","Data":"c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.335748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerStarted","Data":"d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.335792 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerStarted","Data":"7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.349177 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerStarted","Data":"2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.371678 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" podStartSLOduration=3.705876657 podStartE2EDuration="7.371651189s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="2026-02-20 07:08:05.146845379 +0000 UTC m=+1300.019472090" lastFinishedPulling="2026-02-20 07:08:08.812619911 +0000 UTC m=+1303.685246622" observedRunningTime="2026-02-20 07:08:10.355188114 +0000 UTC m=+1305.227814825" watchObservedRunningTime="2026-02-20 07:08:10.371651189 +0000 UTC m=+1305.244277890"
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.373142 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerStarted","Data":"31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.373235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerStarted","Data":"23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.385584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerStarted","Data":"2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.385612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerStarted","Data":"7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.390330 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-t7hr7" podStartSLOduration=3.50325182 podStartE2EDuration="42.390305136s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.943488145 +0000 UTC m=+1264.816114856" lastFinishedPulling="2026-02-20 07:08:08.830541461 +0000 UTC m=+1303.703168172" observedRunningTime="2026-02-20 07:08:10.378076383 +0000 UTC m=+1305.250703094" watchObservedRunningTime="2026-02-20 07:08:10.390305136 +0000 UTC m=+1305.262931847"
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.413654 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"]
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.418430 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7df9984bd9-6txsf" podStartSLOduration=3.520126607 podStartE2EDuration="7.418405809s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="2026-02-20 07:08:04.90432345 +0000 UTC m=+1299.776950171" lastFinishedPulling="2026-02-20 07:08:08.802602672 +0000 UTC m=+1303.675229373" observedRunningTime="2026-02-20 07:08:10.416904493 +0000 UTC m=+1305.289531204" watchObservedRunningTime="2026-02-20 07:08:10.418405809 +0000 UTC m=+1305.291032520"
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.419190 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerStarted","Data":"459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.419256 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerStarted","Data":"106993ad972a41a70b0e11997dac58bd4e6ab90384569399045d1bedeaba95e2"}
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.420262 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.420287 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d9bcd9d4-9jrtt"
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.485573 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" podStartSLOduration=3.55608003 podStartE2EDuration="7.485548877s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="2026-02-20 07:08:04.882833286 +0000 UTC m=+1299.755459997" lastFinishedPulling="2026-02-20 07:08:08.812302133 +0000 UTC m=+1303.684928844" observedRunningTime="2026-02-20 07:08:10.446086831 +0000 UTC m=+1305.318713532" watchObservedRunningTime="2026-02-20 07:08:10.485548877 +0000 UTC m=+1305.358175588"
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.493266 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"]
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.505303 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" podStartSLOduration=4.2396663310000005 podStartE2EDuration="8.505274339s" podCreationTimestamp="2026-02-20 07:08:02 +0000 UTC" firstStartedPulling="2026-02-20 07:08:04.546738177 +0000 UTC m=+1299.419364888" lastFinishedPulling="2026-02-20 07:08:08.812346185 +0000 UTC m=+1303.684972896" observedRunningTime="2026-02-20 07:08:10.472313879 +0000 UTC m=+1305.344940590" watchObservedRunningTime="2026-02-20 07:08:10.505274339 +0000 UTC m=+1305.377901050"
Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.523159 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podStartSLOduration=3.523137046 podStartE2EDuration="3.523137046s" podCreationTimestamp="2026-02-20 07:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:10.50908631 +0000 UTC m=+1305.381713021" watchObservedRunningTime="2026-02-20 07:08:10.523137046 +0000 UTC m=+1305.395763757"
Feb 20 07:08:11 crc kubenswrapper[5094]: I0220 07:08:11.432535 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerStarted","Data":"a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b"}
Feb 20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.447559 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" containerID="cri-o://7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a" gracePeriod=30
Feb 20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.447904 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" containerID="cri-o://2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28" gracePeriod=30
Feb 20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.448318 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" containerID="cri-o://23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d" gracePeriod=30
Feb 20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.449071 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" containerID="cri-o://31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223" gracePeriod=30
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461434 5094 generic.go:334] "Generic (PLEG): container finished" podID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerID="31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223" exitCode=0
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461478 5094 generic.go:334] "Generic (PLEG): container finished" podID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerID="23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d" exitCode=143
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461572 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerDied","Data":"31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223"}
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerDied","Data":"23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d"}
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465143 5094 generic.go:334] "Generic (PLEG): container finished" podID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerID="2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28" exitCode=0
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465211 5094 generic.go:334] "Generic (PLEG): container finished" podID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerID="7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a" exitCode=143
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465232 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerDied","Data":"2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28"}
Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465312 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerDied","Data":"7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a"}
Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.059011 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.155360 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"]
Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.156733 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" containerID="cri-o://886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51" gracePeriod=10
Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.485640 5094 generic.go:334] "Generic (PLEG): container finished" podID="53cdb905-b22d-4849-ae24-6baa2838be39" containerID="886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51" exitCode=0
Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.485690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerDied","Data":"886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51"}
Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.496388 5094 generic.go:334] "Generic (PLEG): container finished" podID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerID="2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82" exitCode=0
Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.496522 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerDied","Data":"2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82"}
Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.685336 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.773902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.521191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerDied","Data":"7bade8250a47c4de537369bc3b59be47a7a82eab789fb96f42d111f476483273"}
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.521737 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bade8250a47c4de537369bc3b59be47a7a82eab789fb96f42d111f476483273"
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.528815 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.651464 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725683 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725842 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725884 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725956 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.726008 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.726066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.731171 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.744033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts" (OuterVolumeSpecName: "scripts") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.753841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d" (OuterVolumeSpecName: "kube-api-access-qf24d") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "kube-api-access-qf24d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.762925 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.771922 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.811751 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data" (OuterVolumeSpecName: "config-data") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827715 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827863 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827923 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828099 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828561 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828589 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828601 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828612 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828621 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828630 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.863034 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25" (OuterVolumeSpecName: "kube-api-access-xpr25") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "kube-api-access-xpr25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.877758 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.884218 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config" (OuterVolumeSpecName: "config") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.925432 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf"
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.928393 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.929009 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.929561 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") "
Feb 20 07:08:17 crc kubenswrapper[5094]: W0220 07:08:17.929918 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/53cdb905-b22d-4849-ae24-6baa2838be39/volumes/kubernetes.io~configmap/dns-svc
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.929949 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930167 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930187 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930199 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930208 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930217 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.944734 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.991294 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031250 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031279 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031443 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031899 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.032042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs" (OuterVolumeSpecName: "logs") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.037597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.037990 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc" (OuterVolumeSpecName: "kube-api-access-mknrc") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "kube-api-access-mknrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.080012 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data" (OuterVolumeSpecName: "config-data") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.080596 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.132725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133050 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133181 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133588 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134182 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134282 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134361 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134417 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134472 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs" (OuterVolumeSpecName: "logs") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.136161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m" (OuterVolumeSpecName: "kube-api-access-65k5m") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "kube-api-access-65k5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.136883 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.166516 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.195797 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data" (OuterVolumeSpecName: "config-data") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236288 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236330 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236343 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236353 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236362 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.530814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerDied","Data":"775f84fe6aa5df6496627179d444588e580571d5d2f0a7733afea33c0965498e"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.530891 5094 scope.go:117] "RemoveContainer" containerID="2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.530902 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.532717 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerDied","Data":"db75b8edda289ee932be6a52e71a112b82ffd485c1dcf3a3df8f8b8577f5dd7d"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.532823 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.542624 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.542840 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" containerID="cri-o://154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543149 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543454 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" containerID="cri-o://515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543529 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" 
containerID="cri-o://735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543571 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" containerID="cri-o://8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.568168 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.569641 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.572941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerDied","Data":"fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.581461 5094 scope.go:117] "RemoveContainer" containerID="7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.601774 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.060106877 podStartE2EDuration="50.601747823s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.872673959 +0000 UTC m=+1264.745300670" lastFinishedPulling="2026-02-20 07:08:17.414314905 +0000 UTC m=+1312.286941616" observedRunningTime="2026-02-20 07:08:18.577588445 +0000 UTC m=+1313.450215156" watchObservedRunningTime="2026-02-20 07:08:18.601747823 +0000 UTC m=+1313.474374534" Feb 20 07:08:18 crc 
kubenswrapper[5094]: I0220 07:08:18.634211 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.642746 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.652793 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.657694 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.672786 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.682506 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.790282 5094 scope.go:117] "RemoveContainer" containerID="886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860273 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860732 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860746 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860777 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="init" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 
07:08:18.860783 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="init" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860794 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860803 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860817 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerName="cinder-db-sync" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860824 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerName="cinder-db-sync" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860837 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860842 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860921 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860927 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860936 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 
07:08:18.860943 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861104 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861122 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861141 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861155 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerName="cinder-db-sync" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861164 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861175 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.866181 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871248 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871410 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f4gh4" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871574 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871666 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.879440 5094 scope.go:117] "RemoveContainer" containerID="135ff79dc99efbd620593401c9d7a73c61d546032c68fab2fd094efa64fa4d62" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.897377 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.953826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954876 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.023518 5094 scope.go:117] "RemoveContainer" containerID="31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057207 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057544 
5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057637 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.058067 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.058591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.061944 5094 scope.go:117] "RemoveContainer" containerID="23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.071326 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.102394 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.103500 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.112902 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.119104 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pbr\" (UniqueName: 
\"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.176601 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.178782 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.195633 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.223766 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.370829 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376043 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376137 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376288 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376485 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376544 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.389468 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.395990 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.436856 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478822 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478886 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478974 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 
07:08:19.479053 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.479090 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480065 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480194 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480809 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480895 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.518185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.531382 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582213 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582275 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582354 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582515 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582536 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 
07:08:19.582555 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.624291 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516" exitCode=0 Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642206 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e" exitCode=2 Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642242 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61" exitCode=0 Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.625899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516"} Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e"} Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642433 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61"} Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.687890 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.687951 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.687973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688030 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688167 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688189 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.690439 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.690814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.696186 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: 
I0220 07:08:19.698228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.698523 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.709313 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.714213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.778672 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.883101 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" path="/var/lib/kubelet/pods/038d4354-a929-4f2d-9633-4cd7dadd4523/volumes" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.883922 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" path="/var/lib/kubelet/pods/53cdb905-b22d-4849-ae24-6baa2838be39/volumes" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.885458 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" path="/var/lib/kubelet/pods/795db1e7-56e9-4ff2-91f1-b3589603d82c/volumes" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.020532 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.119301 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.194923 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.359836 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.390323 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.477975 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.478282 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d777b794-txl9q" 
podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" containerID="cri-o://da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" gracePeriod=30 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.478555 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" containerID="cri-o://3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" gracePeriod=30 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.495019 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.747483 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" exitCode=143 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.747987 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerDied","Data":"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42"} Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.755318 5094 generic.go:334] "Generic (PLEG): container finished" podID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723" exitCode=0 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.755362 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerDied","Data":"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723"} 
Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.755381 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerStarted","Data":"db6c4cfd73d84c6bf37834db171aab681839cbd0872d2e8b1d00c5c8feb0f4da"} Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.762989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerStarted","Data":"7d1b6fe880fd5fac2cb8fe26f239b1778509510802a76237301a5a28526a7e35"} Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.769867 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerStarted","Data":"a5e9daf344d7df70576896055f545ff598209ddc3efe1ce29774c938244e44a7"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.499925 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.784335 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerStarted","Data":"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.784850 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.788351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerStarted","Data":"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.790444 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerStarted","Data":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.816488 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" podStartSLOduration=2.816464873 podStartE2EDuration="2.816464873s" podCreationTimestamp="2026-02-20 07:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:21.810400278 +0000 UTC m=+1316.683026989" watchObservedRunningTime="2026-02-20 07:08:21.816464873 +0000 UTC m=+1316.689091584" Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.840775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerStarted","Data":"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9"} Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.870286 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d" exitCode=0 Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.870432 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d"} Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.884787 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" containerID="cri-o://151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" gracePeriod=30 Feb 20 07:08:22 crc kubenswrapper[5094]: 
I0220 07:08:22.884918 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerStarted","Data":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.885346 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.885965 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" containerID="cri-o://0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" gracePeriod=30 Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.904422 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.176236839 podStartE2EDuration="4.904391398s" podCreationTimestamp="2026-02-20 07:08:18 +0000 UTC" firstStartedPulling="2026-02-20 07:08:20.040151752 +0000 UTC m=+1314.912778463" lastFinishedPulling="2026-02-20 07:08:20.768306311 +0000 UTC m=+1315.640933022" observedRunningTime="2026-02-20 07:08:22.885575598 +0000 UTC m=+1317.758202309" watchObservedRunningTime="2026-02-20 07:08:22.904391398 +0000 UTC m=+1317.777018099" Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.930327 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9303035790000003 podStartE2EDuration="3.930303579s" podCreationTimestamp="2026-02-20 07:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:22.922972294 +0000 UTC m=+1317.795598995" watchObservedRunningTime="2026-02-20 07:08:22.930303579 +0000 UTC m=+1317.802930290" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 
07:08:23.224098 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.399171 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.400963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401221 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401447 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401857 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.409953 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49" (OuterVolumeSpecName: "kube-api-access-jgl49") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "kube-api-access-jgl49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.413183 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.413230 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.413245 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.430730 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts" (OuterVolumeSpecName: "scripts") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.455811 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.498118 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.515952 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.515986 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.515997 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.520420 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data" (OuterVolumeSpecName: "config-data") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.606182 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617359 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617531 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617680 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f825d\" (UniqueName: 
\"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617810 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.618359 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.618422 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.618463 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs" (OuterVolumeSpecName: "logs") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.623572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts" (OuterVolumeSpecName: "scripts") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.623852 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d" (OuterVolumeSpecName: "kube-api-access-f825d") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "kube-api-access-f825d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.640900 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.657241 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.719112 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data" (OuterVolumeSpecName: "config-data") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.720455 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721140 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721178 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721193 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721212 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721228 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721240 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 
07:08:23 crc kubenswrapper[5094]: W0220 07:08:23.721384 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/26e7c194-b2d4-4578-90c0-d0f141b96bdd/volumes/kubernetes.io~secret/config-data Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721412 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data" (OuterVolumeSpecName: "config-data") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.822081 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898025 5094 generic.go:334] "Generic (PLEG): container finished" podID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" exitCode=0 Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898080 5094 generic.go:334] "Generic (PLEG): container finished" podID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" exitCode=143 Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898235 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerDied","Data":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898432 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerDied","Data":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898454 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerDied","Data":"7d1b6fe880fd5fac2cb8fe26f239b1778509510802a76237301a5a28526a7e35"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898486 5094 scope.go:117] "RemoveContainer" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.905403 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.906019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"7d9f8b3046d52c477cb1f1e73376c263067dfb9d891c04aea7389d4a8f986dbe"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.930012 5094 scope.go:117] "RemoveContainer" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.945659 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.956777 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.980210 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.992634 5094 scope.go:117] "RemoveContainer" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: E0220 07:08:23.993312 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": container with ID starting with 0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68 not found: ID does not exist" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993345 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} err="failed to get container status \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": rpc error: code = NotFound desc = could not 
find container \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": container with ID starting with 0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993368 5094 scope.go:117] "RemoveContainer" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: E0220 07:08:23.993664 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": container with ID starting with 151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639 not found: ID does not exist" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993683 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} err="failed to get container status \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": rpc error: code = NotFound desc = could not find container \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": container with ID starting with 151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993695 5094 scope.go:117] "RemoveContainer" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993930 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} err="failed to get container status \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": rpc error: code = NotFound desc = 
could not find container \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": container with ID starting with 0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993947 5094 scope.go:117] "RemoveContainer" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.994510 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} err="failed to get container status \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": rpc error: code = NotFound desc = could not find container \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": container with ID starting with 151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.994531 5094 scope.go:117] "RemoveContainer" containerID="515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.003491 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004040 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004053 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004068 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 
07:08:24.004077 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004100 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004106 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004118 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004125 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004149 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004155 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004165 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004170 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004347 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004362 5094 
memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004372 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004392 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004406 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.005434 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.013643 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.015233 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.015449 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.018736 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.027039 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.037966 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.041833 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.042380 5094 scope.go:117] "RemoveContainer" containerID="735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.044628 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.050868 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.059352 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.090132 5094 scope.go:117] "RemoveContainer" containerID="8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.116244 5094 scope.go:117] "RemoveContainer" containerID="154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130026 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130087 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130295 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130332 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"cinder-api-0\" (UID: 
\"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.224917 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.229090 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231508 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231755 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231846 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.232346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.232725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.232940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233262 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233383 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233506 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233599 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233712 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.237886 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.238803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"cinder-api-0\" (UID: 
\"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.239126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.240448 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.240524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.246855 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.257296 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335485 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335743 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335967 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336035 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336585 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336721 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.337132 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.337255 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.339452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.342943 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.349560 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.350144 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.358782 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.362961 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.532463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.533249 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" containerID="cri-o://a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017" gracePeriod=30 Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.533457 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" containerID="cri-o://be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5" gracePeriod=30 Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.577755 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.584631 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.624011 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.660932 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661206 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763802 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763834 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763876 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763911 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763993 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.771876 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod 
\"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.774776 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.776574 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.777689 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.778964 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.779615 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 
07:08:24.783624 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.905006 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9696/\": read tcp 10.217.0.2:41686->10.217.0.150:9696: read: connection reset by peer" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.925109 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:56714->10.217.0.157:9311: read: connection reset by peer" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.925138 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:56716->10.217.0.157:9311: read: connection reset by peer" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.933069 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.947767 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:25 crc kubenswrapper[5094]: W0220 07:08:25.012750 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f8cb333_2939_4404_b242_67bcf4e6875b.slice/crio-32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623 WatchSource:0}: Error finding container 32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623: Status 404 returned error can't find the container with id 32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.035093 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.691149 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:08:25 crc kubenswrapper[5094]: W0220 07:08:25.700690 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod530069d2_7146_46eb_9c88_056cc8a583b2.slice/crio-bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368 WatchSource:0}: Error finding container bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368: Status 404 returned error can't find the container with id bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.858933 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" path="/var/lib/kubelet/pods/19d2d34d-f935-40e4-a27a-a382c7634da2/volumes" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.860685 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" path="/var/lib/kubelet/pods/26e7c194-b2d4-4578-90c0-d0f141b96bdd/volumes" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.931597 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939063 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" exitCode=0 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939188 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939862 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerDied","Data":"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939893 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerDied","Data":"62bcfccc8f6311f78f2ee50f1468178552942ffa51a8e7c08d763e6569fd3de7"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939916 5094 scope.go:117] "RemoveContainer" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.942057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"45ba2822d7e80855ff79a5771c243c313af7e75204cbacc92b81dca39bd3e6ea"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.946235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerStarted","Data":"32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.949979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerStarted","Data":"bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.966831 5094 generic.go:334] "Generic (PLEG): container finished" podID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerID="be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5" exitCode=0 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.966879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerDied","Data":"be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.000305 5094 scope.go:117] "RemoveContainer" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.026879 5094 scope.go:117] "RemoveContainer" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" Feb 20 07:08:26 crc kubenswrapper[5094]: E0220 07:08:26.029140 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb\": container with ID starting with 3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb not found: ID does not exist" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.029218 5094 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb"} err="failed to get container status \"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb\": rpc error: code = NotFound desc = could not find container \"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb\": container with ID starting with 3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb not found: ID does not exist" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.029253 5094 scope.go:117] "RemoveContainer" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" Feb 20 07:08:26 crc kubenswrapper[5094]: E0220 07:08:26.033064 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42\": container with ID starting with da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42 not found: ID does not exist" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.033139 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42"} err="failed to get container status \"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42\": rpc error: code = NotFound desc = could not find container \"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42\": container with ID starting with da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42 not found: ID does not exist" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107108 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod 
\"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107678 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107901 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107945 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107977 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.108678 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs" (OuterVolumeSpecName: "logs") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.117636 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.117671 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp" (OuterVolumeSpecName: "kube-api-access-twrlp") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "kube-api-access-twrlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.154681 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.184270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data" (OuterVolumeSpecName: "config-data") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210661 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210694 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210718 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210731 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210742 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.296021 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.305202 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.981034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" 
event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerStarted","Data":"5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.981514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerStarted","Data":"2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.981948 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.987782 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.987807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.989222 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerStarted","Data":"d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.989251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerStarted","Data":"e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.989612 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 
07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.003559 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54bd68f77-fkqmr" podStartSLOduration=3.00354013 podStartE2EDuration="3.00354013s" podCreationTimestamp="2026-02-20 07:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:27.002223088 +0000 UTC m=+1321.874849809" watchObservedRunningTime="2026-02-20 07:08:27.00354013 +0000 UTC m=+1321.876166841" Feb 20 07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.034604 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.034576273 podStartE2EDuration="4.034576273s" podCreationTimestamp="2026-02-20 07:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:27.023651511 +0000 UTC m=+1321.896278222" watchObservedRunningTime="2026-02-20 07:08:27.034576273 +0000 UTC m=+1321.907202984" Feb 20 07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.737842 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9696/\": dial tcp 10.217.0.150:9696: connect: connection refused" Feb 20 07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.854141 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" path="/var/lib/kubelet/pods/e9b56f38-2467-4518-bdc3-d6ee665987da/volumes" Feb 20 07:08:28 crc kubenswrapper[5094]: I0220 07:08:28.001831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266"} Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.046089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120"} Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.047507 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.095831 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.577898866 podStartE2EDuration="6.095806668s" podCreationTimestamp="2026-02-20 07:08:23 +0000 UTC" firstStartedPulling="2026-02-20 07:08:25.071611071 +0000 UTC m=+1319.944237782" lastFinishedPulling="2026-02-20 07:08:28.589518843 +0000 UTC m=+1323.462145584" observedRunningTime="2026-02-20 07:08:29.08880324 +0000 UTC m=+1323.961429971" watchObservedRunningTime="2026-02-20 07:08:29.095806668 +0000 UTC m=+1323.968433389" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.534186 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.536344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.669537 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.670321 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" 
containerID="cri-o://a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299" gracePeriod=10 Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.709777 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.062992 5094 generic.go:334] "Generic (PLEG): container finished" podID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerID="a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017" exitCode=0 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.063097 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerDied","Data":"a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017"} Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.064842 5094 generic.go:334] "Generic (PLEG): container finished" podID="b2a1712c-5268-4203-bab0-c427e96b217b" containerID="a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299" exitCode=0 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.065176 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" containerID="cri-o://0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" gracePeriod=30 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.065634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerDied","Data":"a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299"} Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.067228 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" 
containerID="cri-o://67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" gracePeriod=30 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.336136 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.343107 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.518780 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.518912 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.518969 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519072 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519120 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519197 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519237 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519288 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519432 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: 
\"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519463 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519487 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519523 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.528057 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch" (OuterVolumeSpecName: "kube-api-access-7krch") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "kube-api-access-7krch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.528594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx" (OuterVolumeSpecName: "kube-api-access-x72mx") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "kube-api-access-x72mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.559221 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.587589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.593078 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config" (OuterVolumeSpecName: "config") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.606604 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.609527 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.615856 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623636 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623667 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623679 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623690 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623735 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623745 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623755 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623765 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.624893 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.627574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.632903 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.634119 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config" (OuterVolumeSpecName: "config") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.637671 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726631 5094 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726677 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726690 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726722 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726735 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.079950 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerDied","Data":"b9098b5dbcb409320185fc3b697229991f7ab044a2834551fda65cf47f38a5d4"} Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.080044 5094 scope.go:117] "RemoveContainer" containerID="be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.080282 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.085519 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerDied","Data":"0dddba45fa8488a7532914b19dd0a9c232300eec4679520cb1b28149a6920d2e"} Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.085547 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.087836 5094 generic.go:334] "Generic (PLEG): container finished" podID="71648c9f-0170-413c-9f26-d169c9933469" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" exitCode=0 Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.087914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerDied","Data":"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9"} Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.104540 5094 scope.go:117] "RemoveContainer" containerID="a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.134945 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"] Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.150933 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"] Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.152545 5094 scope.go:117] "RemoveContainer" containerID="a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.163059 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:31 crc 
kubenswrapper[5094]: I0220 07:08:31.173470 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.191849 5094 scope.go:117] "RemoveContainer" containerID="548f803cb834d46aa79471e7e46c2cf5bba78c70a36499567ada0307a507be4e" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.859402 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" path="/var/lib/kubelet/pods/6240f946-dbc4-4fdb-b831-23e76bfe2ebc/volumes" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.861724 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" path="/var/lib/kubelet/pods/b2a1712c-5268-4203-bab0-c427e96b217b/volumes" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.637864 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800301 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800427 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod 
\"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800620 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800650 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800874 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.802927 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.809772 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts" (OuterVolumeSpecName: "scripts") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.809793 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr" (OuterVolumeSpecName: "kube-api-access-25pbr") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "kube-api-access-25pbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.820880 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.864218 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909109 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909152 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909167 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909181 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.950803 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data" (OuterVolumeSpecName: "config-data") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.011700 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128108 5094 generic.go:334] "Generic (PLEG): container finished" podID="71648c9f-0170-413c-9f26-d169c9933469" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" exitCode=0 Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128154 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerDied","Data":"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc"} Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerDied","Data":"a5e9daf344d7df70576896055f545ff598209ddc3efe1ce29774c938244e44a7"} Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128202 5094 scope.go:117] "RemoveContainer" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128199 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.153041 5094 scope.go:117] "RemoveContainer" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.171562 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.180024 5094 scope.go:117] "RemoveContainer" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.180519 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9\": container with ID starting with 67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9 not found: ID does not exist" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.180569 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9"} err="failed to get container status \"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9\": rpc error: code = NotFound desc = could not find container \"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9\": container with ID starting with 67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9 not found: ID does not exist" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.180603 5094 scope.go:117] "RemoveContainer" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.181137 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc\": container with ID starting with 0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc not found: ID does not exist" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.181174 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc"} err="failed to get container status \"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc\": rpc error: code = NotFound desc = could not find container \"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc\": container with ID starting with 0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc not found: ID does not exist" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.201294 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.225805 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226233 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226254 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226279 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" Feb 20 07:08:34 crc 
kubenswrapper[5094]: E0220 07:08:34.226293 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="init" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226299 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="init" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226309 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226315 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226332 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226341 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226351 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226357 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226365 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226371 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226385 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226392 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226570 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226586 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226596 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226609 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226624 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226631 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226637 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.227649 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.233240 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.237792 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.318491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.318789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.318931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.319010 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc 
kubenswrapper[5094]: I0220 07:08:34.319318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.319534 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.421837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.421933 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.421978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422060 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.427226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.428805 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " 
pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.429300 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.435156 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.446017 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.562219 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.976069 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:35 crc kubenswrapper[5094]: I0220 07:08:35.123055 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:35 crc kubenswrapper[5094]: I0220 07:08:35.163919 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerStarted","Data":"e629f6202467daa64ea1c5522af0e65990925325c9e7a625f6d5ea287157f10f"} Feb 20 07:08:35 crc kubenswrapper[5094]: I0220 07:08:35.858354 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71648c9f-0170-413c-9f26-d169c9933469" path="/var/lib/kubelet/pods/71648c9f-0170-413c-9f26-d169c9933469/volumes" Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.005200 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.009687 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.183575 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerStarted","Data":"2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7"} Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.853151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.192406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerStarted","Data":"0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23"} Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.219918 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.219889674 podStartE2EDuration="3.219889674s" podCreationTimestamp="2026-02-20 07:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:37.212193009 +0000 UTC m=+1332.084819720" watchObservedRunningTime="2026-02-20 07:08:37.219889674 +0000 UTC m=+1332.092516385" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.804527 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.805749 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.807919 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.808255 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-68tnv" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.808791 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.822757 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.990508 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") 
pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.991020 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.991102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.991158 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.092780 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.092885 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " 
pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.092952 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.093004 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.093966 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.101091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.113071 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.117167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.137433 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.491365 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.499950 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.518686 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.520164 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.536186 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.605780 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.605830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc 
kubenswrapper[5094]: I0220 07:08:38.605863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.605883 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707564 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707663 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.708848 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.716517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.726336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.732233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.847500 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: E0220 07:08:38.951254 5094 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 20 07:08:38 crc kubenswrapper[5094]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec_0(45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576" Netns:"/var/run/netns/9b9b216a-1781-43ac-9654-e754c1788f6b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576;K8S_POD_UID=6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] [openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:a7 [10.217.0.167/23] Feb 20 07:08:38 crc kubenswrapper[5094]: ' Feb 20 07:08:38 crc kubenswrapper[5094]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 20 07:08:38 crc kubenswrapper[5094]: > Feb 20 07:08:38 crc kubenswrapper[5094]: E0220 07:08:38.951362 5094 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 20 07:08:38 crc kubenswrapper[5094]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec_0(45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576" Netns:"/var/run/netns/9b9b216a-1781-43ac-9654-e754c1788f6b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576;K8S_POD_UID=6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] [openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:a7 [10.217.0.167/23] Feb 20 
07:08:38 crc kubenswrapper[5094]: ' Feb 20 07:08:38 crc kubenswrapper[5094]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 20 07:08:38 crc kubenswrapper[5094]: > pod="openstack/openstackclient" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.287910 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.359509 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.378825 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.539067 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:39 crc kubenswrapper[5094]: W0220 07:08:39.543896 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8ca33ba_f76e_4352_b6f1_54588dd25285.slice/crio-7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db WatchSource:0}: Error finding container 7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db: Status 404 returned error can't find the container with id 7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.563120 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566139 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566188 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: 
\"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566548 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566604 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.567246 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.573802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.574457 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89" (OuterVolumeSpecName: "kube-api-access-w4p89") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "kube-api-access-w4p89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.576793 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.669117 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.669158 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.669184 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.852263 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" path="/var/lib/kubelet/pods/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec/volumes" Feb 20 07:08:40 crc kubenswrapper[5094]: I0220 07:08:40.301979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f8ca33ba-f76e-4352-b6f1-54588dd25285","Type":"ContainerStarted","Data":"7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db"} Feb 20 07:08:40 crc kubenswrapper[5094]: I0220 07:08:40.302021 5094 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:40 crc kubenswrapper[5094]: I0220 07:08:40.315521 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.546071 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.548330 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.555989 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.556836 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.556963 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.560261 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.713820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714031 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714116 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714263 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817130 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817261 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817292 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817368 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817406 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817956 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.818333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.824788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.826155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.826818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.827231 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod 
\"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.827639 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.837610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.872187 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.154344 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.155104 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" containerID="cri-o://53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.157160 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" containerID="cri-o://a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.157218 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" containerID="cri-o://3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.157240 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-notification-agent" containerID="cri-o://cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.163623 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.663086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 
07:08:43.335332 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" exitCode=2 Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.335410 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336301 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336639 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" exitCode=0 Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336668 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" exitCode=0 Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerStarted","Data":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339348 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerStarted","Data":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339359 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerStarted","Data":"d9175bae901767e94daa13c4878599492cbc5a434fa253663ae2586b3df20eb3"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339687 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339838 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.365239 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6964856c75-f7xdp" podStartSLOduration=2.36521837 podStartE2EDuration="2.36521837s" podCreationTimestamp="2026-02-20 07:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:43.36191687 +0000 UTC m=+1338.234543581" watchObservedRunningTime="2026-02-20 07:08:43.36521837 +0000 UTC m=+1338.237845081" Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.983623 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094618 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094669 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094751 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094786 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094856 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp4r9\" (UniqueName: 
\"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094979 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.097924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.098829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.104843 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts" (OuterVolumeSpecName: "scripts") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.105319 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9" (OuterVolumeSpecName: "kube-api-access-vp4r9") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "kube-api-access-vp4r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.133841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197431 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197475 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197500 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197512 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197525 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.218779 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.229033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data" (OuterVolumeSpecName: "config-data") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.309536 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.309585 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358130 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" exitCode=0 Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358209 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358260 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120"} Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358295 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"45ba2822d7e80855ff79a5771c243c313af7e75204cbacc92b81dca39bd3e6ea"} Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358313 5094 scope.go:117] "RemoveContainer" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.396972 5094 scope.go:117] "RemoveContainer" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" Feb 20 07:08:44 crc 
kubenswrapper[5094]: I0220 07:08:44.428839 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.444780 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.457552 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458056 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458076 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458090 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458099 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458110 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458117 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458127 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-notification-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458133 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" 
containerName="ceilometer-notification-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458331 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458352 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458367 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-notification-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458378 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.473966 5094 scope.go:117] "RemoveContainer" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.474161 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.476414 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.476982 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.479060 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.511100 5094 scope.go:117] "RemoveContainer" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.556389 5094 scope.go:117] "RemoveContainer" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.562978 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120\": container with ID starting with a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120 not found: ID does not exist" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.563028 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120"} err="failed to get container status \"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120\": rpc error: code = NotFound desc = could not find container \"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120\": container with ID starting with a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120 not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 
07:08:44.563058 5094 scope.go:117] "RemoveContainer" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.563664 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266\": container with ID starting with 3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266 not found: ID does not exist" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.563885 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266"} err="failed to get container status \"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266\": rpc error: code = NotFound desc = could not find container \"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266\": container with ID starting with 3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266 not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.563969 5094 scope.go:117] "RemoveContainer" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.564327 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f\": container with ID starting with cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f not found: ID does not exist" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.564377 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f"} err="failed to get container status \"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f\": rpc error: code = NotFound desc = could not find container \"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f\": container with ID starting with cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.564414 5094 scope.go:117] "RemoveContainer" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.564654 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8\": container with ID starting with 53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8 not found: ID does not exist" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.564684 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8"} err="failed to get container status \"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8\": rpc error: code = NotFound desc = could not find container \"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8\": container with ID starting with 53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8 not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614718 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614762 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614872 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 
07:08:44.614928 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717446 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717532 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717571 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717673 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717699 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.718412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.718841 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.727582 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.727820 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.728221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.728986 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.737851 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.785947 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.813098 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:45 crc kubenswrapper[5094]: I0220 07:08:45.302299 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:45 crc kubenswrapper[5094]: I0220 07:08:45.374722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"72664ddb60921a7c82bcb5bf7c13f327aadde894768d3fc2b11fc7a9a5e76ba1"} Feb 20 07:08:45 crc kubenswrapper[5094]: I0220 07:08:45.878598 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" path="/var/lib/kubelet/pods/efb81c29-b634-4b80-a18d-53ccfdd8dd40/volumes" Feb 20 07:08:46 crc kubenswrapper[5094]: I0220 07:08:46.699169 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:46 crc kubenswrapper[5094]: I0220 07:08:46.699623 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log" containerID="cri-o://d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea" gracePeriod=30 Feb 20 07:08:46 crc kubenswrapper[5094]: I0220 07:08:46.701854 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd" containerID="cri-o://f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18" gracePeriod=30 Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.398651 5094 generic.go:334] "Generic (PLEG): container finished" podID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerID="d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea" exitCode=143 Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.398863 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerDied","Data":"d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea"} Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.726972 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.728496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.745048 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.830753 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.832544 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.852398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.854092 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.855767 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.859863 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.866108 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.894084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.894770 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.928492 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.929730 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.986535 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.996277 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.996339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.996400 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.997483 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.997628 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.997691 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.000129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.033206 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.059855 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"] Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.061476 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.064109 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.067616 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.084056 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"] Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.125068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.125792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.125902 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126133 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126272 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126377 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126412 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc 
kubenswrapper[5094]: I0220 07:08:48.128001 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.128309 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.156264 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.169873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.177904 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.182344 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229195 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229324 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.232318 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.235447 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.267275 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.281409 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.294077 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"]
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.296268 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.302315 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.303185 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.324558 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"]
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.331919 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.332060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.413929 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.435992 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.436108 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.437216 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.454173 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.479725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.668051 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw"
Feb 20 07:08:50 crc kubenswrapper[5094]: I0220 07:08:50.445869 5094 generic.go:334] "Generic (PLEG): container finished" podID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerID="f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18" exitCode=0
Feb 20 07:08:50 crc kubenswrapper[5094]: I0220 07:08:50.446229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerDied","Data":"f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18"}
Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.881690 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6964856c75-f7xdp"
Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.884776 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6964856c75-f7xdp"
Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.909518 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.909811 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" containerID="cri-o://3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7" gracePeriod=30
Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.909961 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" containerID="cri-o://763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859" gracePeriod=30
Feb 20 07:08:52 crc kubenswrapper[5094]: I0220 07:08:52.528516 5094 generic.go:334] "Generic (PLEG): container finished" podID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerID="3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7" exitCode=143
Feb 20 07:08:52 crc kubenswrapper[5094]: I0220 07:08:52.528605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerDied","Data":"3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7"}
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.124686 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159277 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159321 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159364 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159391 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159415 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159442 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.162051 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs" (OuterVolumeSpecName: "logs") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.162245 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.178879 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.193362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7" (OuterVolumeSpecName: "kube-api-access-pkqb7") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "kube-api-access-pkqb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.197246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts" (OuterVolumeSpecName: "scripts") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.246388 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261848 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261887 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261900 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261909 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261919 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261951 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.284977 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.290822 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data" (OuterVolumeSpecName: "config-data") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.294491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.364089 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.364459 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.364470 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.570212 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wrxqf"]
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.574357 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"]
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.586899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f8ca33ba-f76e-4352-b6f1-54588dd25285","Type":"ContainerStarted","Data":"1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe"}
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.619122 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc"}
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.673684 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerDied","Data":"c37272ac6f9e924740c6d7aa103c2e64f6efab2b196866681821b669409f2ee4"}
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.673770 5094 scope.go:117] "RemoveContainer" containerID="f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.674003 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.689004 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"]
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.761873 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.6335531469999998 podStartE2EDuration="16.761848919s" podCreationTimestamp="2026-02-20 07:08:38 +0000 UTC" firstStartedPulling="2026-02-20 07:08:39.546597646 +0000 UTC m=+1334.419224357" lastFinishedPulling="2026-02-20 07:08:53.674893418 +0000 UTC m=+1348.547520129" observedRunningTime="2026-02-20 07:08:54.64953982 +0000 UTC m=+1349.522166531" watchObservedRunningTime="2026-02-20 07:08:54.761848919 +0000 UTC m=+1349.634475630"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.828525 5094 scope.go:117] "RemoveContainer" containerID="d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.859431 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.908319 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.932379 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:08:54 crc kubenswrapper[5094]: E0220 07:08:54.933008 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933025 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log"
Feb 20 07:08:54 crc kubenswrapper[5094]: E0220 07:08:54.933078 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933084 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933335 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933350 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.935278 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.938728 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.950570 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.951530 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.961291 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54bd68f77-fkqmr"
Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.975803 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"]
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.004037 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"]
Feb 20 07:08:55 crc kubenswrapper[5094]: W0220 07:08:55.066649 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fb81f20_1f88_4c11_a37a_31db4472afd2.slice/crio-76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a WatchSource:0}: Error finding container 76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a: Status 404 returned error can't find the container with id 76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.076262 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"]
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.086672 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"]
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.086984 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ddb8575b6-4wznv" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" containerID="cri-o://eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931" gracePeriod=30
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.087560 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ddb8575b6-4wznv" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" containerID="cri-o://419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a" gracePeriod=30
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.088600 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.088739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.088969 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089056 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089183 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089368 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089495 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089693 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195678 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195730 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195860 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.196001 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.197378 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.197992 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.199266 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.218679 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.218683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.218832 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.237544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.237960 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.276534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.310249 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.691023 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerStarted","Data":"1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5"}
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.691091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerStarted","Data":"a329d7a0add1e392fa4985f19f6e93cb9612c786391b2490ffb4a1810885dc7c"}
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.694533 5094 generic.go:334] "Generic (PLEG): container finished" podID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerID="6d3b6790676924518ae410dca9464ac17e8adb8be7c1c0809abd3e37c9afadec" exitCode=0
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.694641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-hkz6h" event={"ID":"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b","Type":"ContainerDied","Data":"6d3b6790676924518ae410dca9464ac17e8adb8be7c1c0809abd3e37c9afadec"}
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.694665 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-hkz6h" event={"ID":"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b","Type":"ContainerStarted","Data":"da11e79f81a9148438c1b80c421638abdc63b2dce3f3a517fc7c35f6a21bff49"}
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.717591 5094 generic.go:334] "Generic (PLEG): container finished" podID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerID="f74e9fd620d60bb7c55d8ca9b94a45b983b355d7aa77e6d394eb827e69cef1af" exitCode=0
Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.717904 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-hlntr" event={"ID":"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9","Type":"ContainerDied","Data":"f74e9fd620d60bb7c55d8ca9b94a45b983b355d7aa77e6d394eb827e69cef1af"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.717946 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hlntr" event={"ID":"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9","Type":"ContainerStarted","Data":"4a19cbf70032a85b200b21fb37bf3d894dcadf7dd6710eb8d150415435e649c4"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.722182 5094 generic.go:334] "Generic (PLEG): container finished" podID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerID="514774461bdf76f918de93cfcbabf0b67e1bca119186db34bd24f1a423cf7e05" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.722266 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wrxqf" event={"ID":"32b230f5-7de4-450c-90e6-e9c18a0d9c0e","Type":"ContainerDied","Data":"514774461bdf76f918de93cfcbabf0b67e1bca119186db34bd24f1a423cf7e05"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.722303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wrxqf" event={"ID":"32b230f5-7de4-450c-90e6-e9c18a0d9c0e","Type":"ContainerStarted","Data":"026cfb43d0585fc1caa48d4ac0edb6168f335e6a26fa3ac7ecc6a4d36f2a9e37"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.726218 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-lb6l5" podStartSLOduration=8.726194435 podStartE2EDuration="8.726194435s" podCreationTimestamp="2026-02-20 07:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:55.712130118 +0000 UTC m=+1350.584756829" watchObservedRunningTime="2026-02-20 07:08:55.726194435 +0000 UTC m=+1350.598821146" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 
07:08:55.732822 5094 generic.go:334] "Generic (PLEG): container finished" podID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerID="763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.732936 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerDied","Data":"763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.736144 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerStarted","Data":"87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.736191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerStarted","Data":"65625c1b377aaa04513c4aa31e214dd00d79211619df4ac8f4928a84644d7951"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.738429 5094 generic.go:334] "Generic (PLEG): container finished" podID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerID="419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.738475 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerDied","Data":"419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.744974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" 
event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerStarted","Data":"5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.745026 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerStarted","Data":"76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.754030 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.847313 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" podStartSLOduration=7.847282845 podStartE2EDuration="7.847282845s" podCreationTimestamp="2026-02-20 07:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:55.817841529 +0000 UTC m=+1350.690468240" watchObservedRunningTime="2026-02-20 07:08:55.847282845 +0000 UTC m=+1350.719909556" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.880110 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" path="/var/lib/kubelet/pods/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b/volumes" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.881024 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.219365 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377410 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377528 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377590 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377754 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377845 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.379043 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.387569 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.388742 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs" (OuterVolumeSpecName: "logs") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.397021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4" (OuterVolumeSpecName: "kube-api-access-wzzp4") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "kube-api-access-wzzp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.402717 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts" (OuterVolumeSpecName: "scripts") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.426155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.450608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data" (OuterVolumeSpecName: "config-data") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.478817 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480322 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480360 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480370 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480403 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 
07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480413 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480421 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480430 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480438 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.509844 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.582732 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.766969 5094 generic.go:334] "Generic (PLEG): container finished" podID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerID="5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b" exitCode=0 Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.767042 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" 
event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerDied","Data":"5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.770571 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.772724 5094 generic.go:334] "Generic (PLEG): container finished" podID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerID="1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5" exitCode=0 Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.772764 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerDied","Data":"1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.797015 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerDied","Data":"d752ec9568b97c7a9a1e0ea7c10ce0973a08508f8beb84b53fb4fc5635c2706f"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.797082 5094 scope.go:117] "RemoveContainer" containerID="763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.797269 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.808261 5094 generic.go:334] "Generic (PLEG): container finished" podID="50bf9176-b504-436f-a845-7ab55506a258" containerID="87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02" exitCode=0 Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.808382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerDied","Data":"87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.831810 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerStarted","Data":"fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.832149 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerStarted","Data":"ca1059f5843b1683c2b383612bdd39e42db92c13986bf2a30fa4cc0e0bdde634"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.898247 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.913563 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.937352 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: E0220 07:08:56.949836 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" Feb 20 07:08:56 crc 
kubenswrapper[5094]: I0220 07:08:56.949873 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" Feb 20 07:08:56 crc kubenswrapper[5094]: E0220 07:08:56.949911 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.949917 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.950182 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.950202 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.951208 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.956506 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.958971 5094 scope.go:117] "RemoveContainer" containerID="3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.959242 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.971508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095582 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095683 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095732 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 
crc kubenswrapper[5094]: I0220 07:08:57.095777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095806 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095871 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 
07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.202770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.203194 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.203813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.203228 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204197 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204236 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204756 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204793 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204854 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.205363 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.211253 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.216621 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.220935 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.227374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.227529 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.261805 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.271028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.335894 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.409592 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.409867 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.410686 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" (UID: 
"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.443384 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc" (OuterVolumeSpecName: "kube-api-access-z7hfc") pod "0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" (UID: "0626209e-1ab6-4bd1-a5cc-35a2f6525e5b"). InnerVolumeSpecName "kube-api-access-z7hfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.513640 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.513681 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.714743 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.785232 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.832960 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.836727 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.836915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.849478 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" (UID: "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.891549 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" path="/var/lib/kubelet/pods/1b5ca0fd-39ce-46da-8a15-cf0d7265e060/volumes" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.894362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg" (OuterVolumeSpecName: "kube-api-access-m24hg") pod "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" (UID: "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9"). InnerVolumeSpecName "kube-api-access-m24hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947294 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod \"50bf9176-b504-436f-a845-7ab55506a258\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947383 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947473 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"50bf9176-b504-436f-a845-7ab55506a258\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947507 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.948301 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.948324 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m24hg\" (UniqueName: 
\"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.949564 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.968361 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50bf9176-b504-436f-a845-7ab55506a258" (UID: "50bf9176-b504-436f-a845-7ab55506a258"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.972201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn" (OuterVolumeSpecName: "kube-api-access-fsqkn") pod "50bf9176-b504-436f-a845-7ab55506a258" (UID: "50bf9176-b504-436f-a845-7ab55506a258"). InnerVolumeSpecName "kube-api-access-fsqkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.972340 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32b230f5-7de4-450c-90e6-e9c18a0d9c0e" (UID: "32b230f5-7de4-450c-90e6-e9c18a0d9c0e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.973035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerDied","Data":"65625c1b377aaa04513c4aa31e214dd00d79211619df4ac8f4928a84644d7951"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.973176 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65625c1b377aaa04513c4aa31e214dd00d79211619df4ac8f4928a84644d7951" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.975166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-hkz6h" event={"ID":"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b","Type":"ContainerDied","Data":"da11e79f81a9148438c1b80c421638abdc63b2dce3f3a517fc7c35f6a21bff49"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.975238 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da11e79f81a9148438c1b80c421638abdc63b2dce3f3a517fc7c35f6a21bff49" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.975390 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.977758 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.977829 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hlntr" event={"ID":"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9","Type":"ContainerDied","Data":"4a19cbf70032a85b200b21fb37bf3d894dcadf7dd6710eb8d150415435e649c4"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.977878 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a19cbf70032a85b200b21fb37bf3d894dcadf7dd6710eb8d150415435e649c4" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.979210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wrxqf" event={"ID":"32b230f5-7de4-450c-90e6-e9c18a0d9c0e","Type":"ContainerDied","Data":"026cfb43d0585fc1caa48d4ac0edb6168f335e6a26fa3ac7ecc6a4d36f2a9e37"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.979261 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026cfb43d0585fc1caa48d4ac0edb6168f335e6a26fa3ac7ecc6a4d36f2a9e37" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.982438 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.026923 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j" (OuterVolumeSpecName: "kube-api-access-xg46j") pod "32b230f5-7de4-450c-90e6-e9c18a0d9c0e" (UID: "32b230f5-7de4-450c-90e6-e9c18a0d9c0e"). InnerVolumeSpecName "kube-api-access-xg46j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061645 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061682 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061695 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061795 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.204091 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.404281 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.483476 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"7fb81f20-1f88-4c11-a37a-31db4472afd2\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.483597 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"7fb81f20-1f88-4c11-a37a-31db4472afd2\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.491084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k" (OuterVolumeSpecName: "kube-api-access-ztt8k") pod "7fb81f20-1f88-4c11-a37a-31db4472afd2" (UID: "7fb81f20-1f88-4c11-a37a-31db4472afd2"). InnerVolumeSpecName "kube-api-access-ztt8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.491261 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fb81f20-1f88-4c11-a37a-31db4472afd2" (UID: "7fb81f20-1f88-4c11-a37a-31db4472afd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.500026 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.588318 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.588563 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.588799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43cfca6d-55e3-431f-b5b8-2b8db44bcee0" (UID: "43cfca6d-55e3-431f-b5b8-2b8db44bcee0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.589852 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.589878 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.589891 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.618239 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv" (OuterVolumeSpecName: "kube-api-access-t67jv") pod "43cfca6d-55e3-431f-b5b8-2b8db44bcee0" (UID: "43cfca6d-55e3-431f-b5b8-2b8db44bcee0"). InnerVolumeSpecName "kube-api-access-t67jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.692048 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.992071 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.992067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerDied","Data":"a329d7a0add1e392fa4985f19f6e93cb9612c786391b2490ffb4a1810885dc7c"} Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.992157 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a329d7a0add1e392fa4985f19f6e93cb9612c786391b2490ffb4a1810885dc7c" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.994234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerStarted","Data":"d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.006322 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.006896 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerDied","Data":"76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.007027 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.020208 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.020188254 podStartE2EDuration="5.020188254s" podCreationTimestamp="2026-02-20 07:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:59.016868314 +0000 UTC m=+1353.889495025" watchObservedRunningTime="2026-02-20 07:08:59.020188254 +0000 UTC m=+1353.892814965" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.020520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerStarted","Data":"0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.021545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerStarted","Data":"85e8b39c918088f619ee9b44d81cb6828488069406841b038890d491ba98168a"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032073 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032296 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" containerID="cri-o://3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032445 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" containerID="cri-o://a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032488 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" containerID="cri-o://5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032528 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" containerID="cri-o://00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.070922 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.088236484 podStartE2EDuration="15.070895948s" podCreationTimestamp="2026-02-20 07:08:44 +0000 UTC" 
firstStartedPulling="2026-02-20 07:08:45.314673488 +0000 UTC m=+1340.187300199" lastFinishedPulling="2026-02-20 07:08:57.297332952 +0000 UTC m=+1352.169959663" observedRunningTime="2026-02-20 07:08:59.065005727 +0000 UTC m=+1353.937632448" watchObservedRunningTime="2026-02-20 07:08:59.070895948 +0000 UTC m=+1353.943522659" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.072378 5094 generic.go:334] "Generic (PLEG): container finished" podID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerID="eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931" exitCode=0 Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.074463 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerDied","Data":"eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.085870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerStarted","Data":"5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.090990 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" exitCode=0 Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091031 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" exitCode=2 Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091041 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" exitCode=0 Feb 20 07:09:00 crc 
kubenswrapper[5094]: I0220 07:09:00.091550 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.124810 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.124784008 podStartE2EDuration="4.124784008s" podCreationTimestamp="2026-02-20 07:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:00.114842149 +0000 UTC m=+1354.987468860" watchObservedRunningTime="2026-02-20 07:09:00.124784008 +0000 UTC m=+1354.997410719" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.387505 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434416 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434488 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434530 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434638 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434765 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.444254 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.449917 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4" (OuterVolumeSpecName: "kube-api-access-t5fx4") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "kube-api-access-t5fx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.495566 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.497284 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config" (OuterVolumeSpecName: "config") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540671 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540757 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540771 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540786 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.546001 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.642914 5094 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.085184 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.105914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerDied","Data":"7216118bc6764d988378bcc95f70afbc24e44597d2724806d33cbd64bb7f1c0b"} Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.106261 5094 scope.go:117] "RemoveContainer" containerID="419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.106225 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.113434 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" exitCode=0 Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.115426 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.117325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc"} Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.117662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"72664ddb60921a7c82bcb5bf7c13f327aadde894768d3fc2b11fc7a9a5e76ba1"} Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.140474 5094 scope.go:117] "RemoveContainer" containerID="eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.141901 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.152183 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.152262 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.152184 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154368 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154863 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.155565 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.155729 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.156492 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.156524 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.165340 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts" (OuterVolumeSpecName: "scripts") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.166030 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk" (OuterVolumeSpecName: "kube-api-access-7thfk") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "kube-api-access-7thfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.178665 5094 scope.go:117] "RemoveContainer" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.260438 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.260472 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.276878 5094 scope.go:117] "RemoveContainer" containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.282883 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.348796 5094 scope.go:117] "RemoveContainer" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.366830 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.418496 5094 scope.go:117] "RemoveContainer" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.422189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.464104 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data" (OuterVolumeSpecName: "config-data") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.468373 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.468411 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.470081 5094 scope.go:117] "RemoveContainer" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.470622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168\": container with ID starting with a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168 not found: ID does not exist" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.470666 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168"} err="failed to get container status \"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168\": rpc error: code = NotFound desc = could not find container \"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168\": container with ID starting with a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168 not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.470712 5094 scope.go:117] "RemoveContainer" 
containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.471299 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178\": container with ID starting with 5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178 not found: ID does not exist" containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471358 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178"} err="failed to get container status \"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178\": rpc error: code = NotFound desc = could not find container \"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178\": container with ID starting with 5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178 not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471390 5094 scope.go:117] "RemoveContainer" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.471766 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed\": container with ID starting with 00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed not found: ID does not exist" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471791 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed"} err="failed to get container status \"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed\": rpc error: code = NotFound desc = could not find container \"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed\": container with ID starting with 00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471807 5094 scope.go:117] "RemoveContainer" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.473072 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc\": container with ID starting with 3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc not found: ID does not exist" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.473109 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc"} err="failed to get container status \"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc\": rpc error: code = NotFound desc = could not find container \"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc\": container with ID starting with 3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.752238 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.761509 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 
20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.773621 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774255 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bf9176-b504-436f-a845-7ab55506a258" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774376 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bf9176-b504-436f-a845-7ab55506a258" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774454 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774518 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774577 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774642 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774795 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774862 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774917 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" 
Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774964 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775015 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775063 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775120 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775176 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775233 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775283 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775340 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775395 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775451 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775505 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775577 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775895 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776369 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776459 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bf9176-b504-436f-a845-7ab55506a258" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776525 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776587 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776649 5094 
memory_manager.go:354] "RemoveStaleState removing state" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776739 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776795 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776861 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776928 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776995 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.777057 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.777128 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.780027 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.783299 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.783515 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.787264 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.854002 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" path="/var/lib/kubelet/pods/7e271af7-d690-418f-a044-e9a87e519a5a/volumes" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.854908 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" path="/var/lib/kubelet/pods/c5f4dc72-bb77-44c4-8058-4939958d7a48/volumes" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.875986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876271 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876387 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876803 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876900 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.978651 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " 
pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979148 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979184 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979249 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979327 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979363 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979949 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.980063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.990545 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.990570 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.990685 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.996466 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:02 crc kubenswrapper[5094]: I0220 07:09:02.008734 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:02 crc kubenswrapper[5094]: I0220 07:09:02.100794 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:02 crc kubenswrapper[5094]: I0220 07:09:02.624002 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.140767 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"f94a0c0b0f09c18362f4c831a12784474e87ae445bfb68efe6344e1d738ee970"} Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.289945 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.292210 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.300666 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.301096 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.301389 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j928v" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.317673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409597 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409665 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409716 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " 
pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409816 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511629 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511752 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511849 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: 
\"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.520291 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.520934 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.522127 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.528860 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.628368 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.185997 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25"} Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.186927 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273"} Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.252755 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:09:04 crc kubenswrapper[5094]: W0220 07:09:04.266168 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b534507_5d2d_496b_9a60_f0b45e25bb23.slice/crio-00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af WatchSource:0}: Error finding container 00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af: Status 404 returned error can't find the container with id 00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.761925 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.224740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerStarted","Data":"00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af"} Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.235441 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c"} Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.311225 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.311276 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.357479 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.362144 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249476 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924"} Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249647 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" containerID="cri-o://1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249673 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" containerID="cri-o://52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249733 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" containerID="cri-o://c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249746 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" containerID="cri-o://92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.250568 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.250625 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.250640 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.279622 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.952487478 podStartE2EDuration="5.27959667s" podCreationTimestamp="2026-02-20 07:09:01 +0000 UTC" firstStartedPulling="2026-02-20 07:09:02.64145068 +0000 UTC m=+1357.514077381" lastFinishedPulling="2026-02-20 07:09:05.968559862 +0000 UTC m=+1360.841186573" observedRunningTime="2026-02-20 07:09:06.273963936 +0000 UTC m=+1361.146590647" watchObservedRunningTime="2026-02-20 07:09:06.27959667 +0000 UTC m=+1361.152223381" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.266882 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0797f40-7313-479d-95ab-0a65d83b96d1" 
containerID="92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c" exitCode=2 Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.267464 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerID="c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25" exitCode=0 Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.266939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c"} Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.267583 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25"} Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.336773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.336844 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.463380 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.472184 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.278288 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.278349 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.482474 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.482980 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.659302 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:10 crc kubenswrapper[5094]: I0220 07:09:10.252283 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 07:09:10 crc kubenswrapper[5094]: I0220 07:09:10.264020 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 07:09:16 crc kubenswrapper[5094]: I0220 07:09:16.435858 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerStarted","Data":"a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01"} Feb 20 07:09:16 crc kubenswrapper[5094]: I0220 07:09:16.468779 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rfczb" podStartSLOduration=2.110722678 podStartE2EDuration="13.468751634s" podCreationTimestamp="2026-02-20 07:09:03 +0000 UTC" firstStartedPulling="2026-02-20 07:09:04.269283796 +0000 UTC m=+1359.141910497" lastFinishedPulling="2026-02-20 07:09:15.627312742 +0000 UTC m=+1370.499939453" observedRunningTime="2026-02-20 07:09:16.450873536 +0000 UTC m=+1371.323500277" watchObservedRunningTime="2026-02-20 07:09:16.468751634 +0000 UTC m=+1371.341378355" Feb 20 07:09:17 crc kubenswrapper[5094]: I0220 07:09:17.453546 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerID="1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273" exitCode=0 Feb 20 07:09:17 crc kubenswrapper[5094]: I0220 07:09:17.454920 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273"} Feb 20 07:09:22 crc kubenswrapper[5094]: I0220 07:09:22.978347 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:22 crc kubenswrapper[5094]: I0220 07:09:22.984625 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:22 crc kubenswrapper[5094]: I0220 07:09:22.990321 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.095311 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.095397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.095424 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.198591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199017 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199043 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199792 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.221556 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.307289 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.801900 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:24 crc kubenswrapper[5094]: I0220 07:09:24.527813 5094 generic.go:334] "Generic (PLEG): container finished" podID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" exitCode=0 Feb 20 07:09:24 crc kubenswrapper[5094]: I0220 07:09:24.527929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953"} Feb 20 07:09:24 crc kubenswrapper[5094]: I0220 07:09:24.528249 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerStarted","Data":"06256a1ce1be59544899f5fa410cb2b0ac1384aa6f324b0c9ca14e8938c56f75"} Feb 20 07:09:25 crc kubenswrapper[5094]: I0220 07:09:25.542403 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" exitCode=0 Feb 20 07:09:25 crc kubenswrapper[5094]: I0220 07:09:25.542480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db"} Feb 20 07:09:26 crc kubenswrapper[5094]: I0220 07:09:26.569629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerStarted","Data":"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592"} Feb 20 07:09:27 crc kubenswrapper[5094]: I0220 07:09:27.591930 5094 generic.go:334] "Generic (PLEG): container finished" podID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerID="a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01" exitCode=0 Feb 20 07:09:27 crc kubenswrapper[5094]: I0220 07:09:27.592255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerDied","Data":"a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01"} Feb 20 07:09:27 crc kubenswrapper[5094]: I0220 07:09:27.635239 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-htpf7" podStartSLOduration=4.143455556 podStartE2EDuration="5.635201952s" podCreationTimestamp="2026-02-20 07:09:22 +0000 UTC" firstStartedPulling="2026-02-20 07:09:24.530865455 +0000 UTC m=+1379.403492176" lastFinishedPulling="2026-02-20 07:09:26.022611861 +0000 UTC m=+1380.895238572" observedRunningTime="2026-02-20 07:09:26.606379952 +0000 UTC m=+1381.479006683" watchObservedRunningTime="2026-02-20 07:09:27.635201952 +0000 UTC m=+1382.507828703" Feb 20 
07:09:28 crc kubenswrapper[5094]: I0220 07:09:28.977266 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.058139 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.070021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh" (OuterVolumeSpecName: "kube-api-access-k5bjh") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "kube-api-access-k5bjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.159914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.160099 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.160266 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod 
\"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.161096 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.166300 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts" (OuterVolumeSpecName: "scripts") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.196091 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data" (OuterVolumeSpecName: "config-data") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.219806 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.264014 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.264082 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.264109 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.650520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerDied","Data":"00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af"} Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.650688 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.651106 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.838662 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:09:29 crc kubenswrapper[5094]: E0220 07:09:29.839127 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerName="nova-cell0-conductor-db-sync" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.839150 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerName="nova-cell0-conductor-db-sync" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.839330 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerName="nova-cell0-conductor-db-sync" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.840168 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.843380 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.856055 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.856807 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j928v" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.880585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc 
kubenswrapper[5094]: I0220 07:09:29.880652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.880774 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.982171 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.982235 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.982307 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.987452 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.987793 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.002522 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.156322 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.628986 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:09:30 crc kubenswrapper[5094]: W0220 07:09:30.630164 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5bbb9ad_deeb_495f_9750_f7012c00061d.slice/crio-8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937 WatchSource:0}: Error finding container 8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937: Status 404 returned error can't find the container with id 8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937 Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.663158 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerStarted","Data":"8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937"} Feb 20 07:09:31 crc kubenswrapper[5094]: I0220 07:09:31.677648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerStarted","Data":"c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf"} Feb 20 07:09:31 crc kubenswrapper[5094]: I0220 07:09:31.678352 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:31 crc kubenswrapper[5094]: I0220 07:09:31.701835 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.7018136630000003 podStartE2EDuration="2.701813663s" podCreationTimestamp="2026-02-20 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 
07:09:31.694279983 +0000 UTC m=+1386.566906694" watchObservedRunningTime="2026-02-20 07:09:31.701813663 +0000 UTC m=+1386.574440384" Feb 20 07:09:32 crc kubenswrapper[5094]: I0220 07:09:32.108819 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.308402 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.309021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.403494 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.787172 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.860340 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.209653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.723635 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-htpf7" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" containerID="cri-o://4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" gracePeriod=2 Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.829803 5094 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.831277 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.838265 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.838401 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.855037 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.856910 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.857137 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.857191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.861966 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959550 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959630 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc 
kubenswrapper[5094]: I0220 07:09:35.980736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.981296 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:35.999930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.013387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.158337 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.174248 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.182172 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.182355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.205242 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.235853 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.238010 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.242752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.277569 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294057 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294171 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294306 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294338 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.391452 5094 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.393666 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.398765 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402460 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402677 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402712 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402754 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.404524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.411233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.424414 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.426327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.428377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.435061 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.437302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.445806 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: E0220 07:09:36.446395 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-content" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446415 5094 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-content" Feb 20 07:09:36 crc kubenswrapper[5094]: E0220 07:09:36.446467 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-utilities" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446474 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-utilities" Feb 20 07:09:36 crc kubenswrapper[5094]: E0220 07:09:36.446493 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446499 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446660 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.447113 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.458056 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.468217 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.483134 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.503884 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.505925 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"9a6d10f4-9048-4e6c-831c-8342e340d290\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.505958 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"9a6d10f4-9048-4e6c-831c-8342e340d290\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506084 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"9a6d10f4-9048-4e6c-831c-8342e340d290\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506287 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc 
kubenswrapper[5094]: I0220 07:09:36.506318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506346 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506409 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506442 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506480 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.508688 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities" (OuterVolumeSpecName: "utilities") pod "9a6d10f4-9048-4e6c-831c-8342e340d290" (UID: "9a6d10f4-9048-4e6c-831c-8342e340d290"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.519877 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx" (OuterVolumeSpecName: "kube-api-access-b5fdx") pod "9a6d10f4-9048-4e6c-831c-8342e340d290" (UID: "9a6d10f4-9048-4e6c-831c-8342e340d290"). InnerVolumeSpecName "kube-api-access-b5fdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.591331 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.591682 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a6d10f4-9048-4e6c-831c-8342e340d290" (UID: "9a6d10f4-9048-4e6c-831c-8342e340d290"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.593584 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627379 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627802 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627873 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627927 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627978 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628056 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628117 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628183 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628194 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628204 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.635719 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.636954 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.637652 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.638508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.644780 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.657347 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.658591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.677687 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.684212 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.729410 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.729917 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.730177 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: 
\"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731121 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.747843 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerID="52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924" exitCode=137 Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.747935 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924"} Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753771 5094 generic.go:334] "Generic (PLEG): container finished" podID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" exitCode=0 Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753841 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592"} Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"06256a1ce1be59544899f5fa410cb2b0ac1384aa6f324b0c9ca14e8938c56f75"} Feb 20 07:09:36 crc kubenswrapper[5094]: 
I0220 07:09:36.753939 5094 scope.go:117] "RemoveContainer" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753983 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.803639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.832272 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.834416 5094 scope.go:117] "RemoveContainer" containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.845494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856193 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856271 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856674 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.857070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.857235 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.857908 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.858497 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.858558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.859097 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.859449 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.883898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.970667 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.036692 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.079131 5094 scope.go:117] "RemoveContainer" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.186449 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.207450 5094 scope.go:117] "RemoveContainer" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.207983 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592\": container with ID starting with 4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592 not found: ID does not exist" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208012 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592"} err="failed to get container status \"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592\": rpc error: code = NotFound desc = could not find container \"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592\": container with ID starting with 4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592 not found: ID does not exist" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208035 5094 scope.go:117] "RemoveContainer" 
containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.208468 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db\": container with ID starting with 21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db not found: ID does not exist" containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208529 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db"} err="failed to get container status \"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db\": rpc error: code = NotFound desc = could not find container \"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db\": container with ID starting with 21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db not found: ID does not exist" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208562 5094 scope.go:117] "RemoveContainer" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.208853 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953\": container with ID starting with c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953 not found: ID does not exist" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208919 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953"} err="failed to get container status \"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953\": rpc error: code = NotFound desc = could not find container \"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953\": container with ID starting with c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953 not found: ID does not exist" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271533 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271605 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271640 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271815 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271908 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271942 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.272011 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.273183 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.273678 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.281137 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k" (OuterVolumeSpecName: "kube-api-access-bfk5k") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "kube-api-access-bfk5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.282692 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts" (OuterVolumeSpecName: "scripts") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.340389 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.340980 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.340995 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.341006 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341012 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.341039 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341047 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.341056 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341063 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341243 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341269 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341288 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341309 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.342314 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.345429 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.345439 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.352286 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.363100 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375198 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375225 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375346 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375365 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375472 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375508 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375523 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.411257 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.428431 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data" (OuterVolumeSpecName: "config-data") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478071 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478252 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478303 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppk9z\" (UniqueName: 
\"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478329 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478430 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478447 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.483564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.484207 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.490416 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.507590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.543938 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.571508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.585530 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.665564 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.730367 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: W0220 07:09:37.744360 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c7c378_78bc_48bd_932c_fa19cf4e6284.slice/crio-6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0 WatchSource:0}: Error finding container 6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0: Status 404 returned error can't find the container with id 6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0 Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.787541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"f94a0c0b0f09c18362f4c831a12784474e87ae445bfb68efe6344e1d738ee970"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.789655 5094 scope.go:117] "RemoveContainer" containerID="52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.787868 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.805898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerStarted","Data":"9e410d4605ab52b279cb99a108d57c885a2eae7e022b21ab722b6893f02390c6"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.808422 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerStarted","Data":"d76d353af2545ad8202d965c9d3961ffa837c726f661aaff8084a4e9ecb335a6"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.810865 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.818951 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerStarted","Data":"20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.818983 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerStarted","Data":"80f14b4d6e479a64d507e140a2b75fccd3a83dbdb2db5beaa1b87cdc4abdeef2"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.825451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerStarted","Data":"6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.827194 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerStarted","Data":"4e611ddd33c80e60aea3bd50de5978b4641402d17394778695691b74d9d5ce40"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.844927 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-v6v6k" podStartSLOduration=2.844903016 podStartE2EDuration="2.844903016s" podCreationTimestamp="2026-02-20 07:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:37.842978009 +0000 UTC m=+1392.715604720" watchObservedRunningTime="2026-02-20 07:09:37.844903016 +0000 UTC m=+1392.717529727" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.849011 5094 scope.go:117] "RemoveContainer" containerID="92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.901162 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" path="/var/lib/kubelet/pods/9a6d10f4-9048-4e6c-831c-8342e340d290/volumes" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.934346 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.945022 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.981803 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.988673 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001270 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001336 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001375 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001451 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001492 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001540 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.002622 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.005641 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.021570 5094 scope.go:117] "RemoveContainer" containerID="c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.027100 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103062 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") 
" pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103196 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103225 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103251 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103300 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.104141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.112190 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.113368 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.116883 5094 scope.go:117] "RemoveContainer" containerID="1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.118541 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.121397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.125353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.187982 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:09:38 crc kubenswrapper[5094]: W0220 07:09:38.212185 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c08d54_fdef_4808_bf52_f8ea0894af36.slice/crio-ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b WatchSource:0}: Error finding container ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b: Status 404 returned error can't find the container with id ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.371835 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.870434 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e" exitCode=0 Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.872849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerDied","Data":"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.872917 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerStarted","Data":"b6d15829f95b8aff57b91d13f507a8fa1a3e6f6b0bdf9f807b5778fd0588a0ff"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.880821 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerStarted","Data":"1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.880865 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerStarted","Data":"ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.925752 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.934816 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-pk97g" podStartSLOduration=1.934783247 podStartE2EDuration="1.934783247s" 
podCreationTimestamp="2026-02-20 07:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:38.917944604 +0000 UTC m=+1393.790571315" watchObservedRunningTime="2026-02-20 07:09:38.934783247 +0000 UTC m=+1393.807409958" Feb 20 07:09:39 crc kubenswrapper[5094]: I0220 07:09:39.858844 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" path="/var/lib/kubelet/pods/a0797f40-7313-479d-95ab-0a65d83b96d1/volumes" Feb 20 07:09:40 crc kubenswrapper[5094]: I0220 07:09:40.115598 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:40 crc kubenswrapper[5094]: I0220 07:09:40.185920 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:40 crc kubenswrapper[5094]: I0220 07:09:40.903570 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"33dad10331613824fe73aa48d99c70dc5318fba2b2e24e20fa4abae1bee74f21"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.928321 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerStarted","Data":"f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.931390 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.961268 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" 
event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerStarted","Data":"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.965127 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.983833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerStarted","Data":"f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.984103 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5" gracePeriod=30 Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.994792 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerStarted","Data":"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.999123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerStarted","Data":"8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.999284 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" containerID="cri-o://8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561" gracePeriod=30 Feb 20 07:09:41 crc 
kubenswrapper[5094]: I0220 07:09:41.999531 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" containerID="cri-o://3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484" gracePeriod=30 Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.002019 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" podStartSLOduration=6.001994385 podStartE2EDuration="6.001994385s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:41.98846035 +0000 UTC m=+1396.861087061" watchObservedRunningTime="2026-02-20 07:09:42.001994385 +0000 UTC m=+1396.874621096" Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.016302 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.271058182 podStartE2EDuration="6.016278847s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.537190786 +0000 UTC m=+1392.409817497" lastFinishedPulling="2026-02-20 07:09:41.282411451 +0000 UTC m=+1396.155038162" observedRunningTime="2026-02-20 07:09:42.008679295 +0000 UTC m=+1396.881306006" watchObservedRunningTime="2026-02-20 07:09:42.016278847 +0000 UTC m=+1396.888905558" Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.057907 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.320484644 podStartE2EDuration="6.057880503s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.551544619 +0000 UTC m=+1392.424171330" lastFinishedPulling="2026-02-20 07:09:41.288940478 +0000 UTC m=+1396.161567189" 
observedRunningTime="2026-02-20 07:09:42.037564816 +0000 UTC m=+1396.910191527" watchObservedRunningTime="2026-02-20 07:09:42.057880503 +0000 UTC m=+1396.930507204" Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.065127 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.496635154 podStartE2EDuration="6.065104816s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.748367594 +0000 UTC m=+1392.620994295" lastFinishedPulling="2026-02-20 07:09:41.316837246 +0000 UTC m=+1396.189463957" observedRunningTime="2026-02-20 07:09:42.059791769 +0000 UTC m=+1396.932418480" watchObservedRunningTime="2026-02-20 07:09:42.065104816 +0000 UTC m=+1396.937731527" Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.018368 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerStarted","Data":"ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.022844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.027263 5094 generic.go:334] "Generic (PLEG): container finished" podID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerID="8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561" exitCode=143 Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.028815 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerDied","Data":"8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.028872 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerStarted","Data":"3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.071806 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.300036494 podStartE2EDuration="7.071779935s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.537256867 +0000 UTC m=+1392.409883568" lastFinishedPulling="2026-02-20 07:09:41.309000298 +0000 UTC m=+1396.181627009" observedRunningTime="2026-02-20 07:09:43.053817465 +0000 UTC m=+1397.926444216" watchObservedRunningTime="2026-02-20 07:09:43.071779935 +0000 UTC m=+1397.944406656" Feb 20 07:09:44 crc kubenswrapper[5094]: I0220 07:09:44.041022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"} Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.080516 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"} Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.081021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.082907 5094 generic.go:334] "Generic (PLEG): container finished" podID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerID="20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151" exitCode=0 Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.082953 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerDied","Data":"20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151"} Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.107930 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.308940555 podStartE2EDuration="9.107903638s" podCreationTimestamp="2026-02-20 07:09:37 +0000 UTC" firstStartedPulling="2026-02-20 07:09:41.147139751 +0000 UTC m=+1396.019766462" lastFinishedPulling="2026-02-20 07:09:44.946102794 +0000 UTC m=+1399.818729545" observedRunningTime="2026-02-20 07:09:46.103084663 +0000 UTC m=+1400.975711384" watchObservedRunningTime="2026-02-20 07:09:46.107903638 +0000 UTC m=+1400.980530349" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.659513 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.680019 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.680124 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.730996 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.731104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.769330 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.847532 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:09:46 crc 
kubenswrapper[5094]: I0220 07:09:46.847902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.973012 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.059007 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.059299 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns" containerID="cri-o://fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6" gracePeriod=10 Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.121767 5094 generic.go:334] "Generic (PLEG): container finished" podID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerID="1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862" exitCode=0 Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.123282 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerDied","Data":"1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862"} Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.177776 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.713511 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.719512 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.764913 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.765023 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.815845 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.815962 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816072 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816209 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816252 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816368 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816454 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816524 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816598 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") "
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.868262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts" (OuterVolumeSpecName: "scripts") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.873287 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp" (OuterVolumeSpecName: "kube-api-access-pw5mp") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "kube-api-access-pw5mp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.874644 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt" (OuterVolumeSpecName: "kube-api-access-dmkxt") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "kube-api-access-dmkxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.891908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data" (OuterVolumeSpecName: "config-data") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.896841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.930574 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932624 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932721 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932804 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932870 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.934636 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.947652 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.969003 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config" (OuterVolumeSpecName: "config") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.976258 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.977180 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034805 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034835 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034845 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034860 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034871 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.133619 5094 generic.go:334] "Generic (PLEG): container finished" podID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6" exitCode=0
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.133671 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerDied","Data":"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"}
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.134236 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerDied","Data":"db6c4cfd73d84c6bf37834db171aab681839cbd0872d2e8b1d00c5c8feb0f4da"}
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.134264 5094 scope.go:117] "RemoveContainer" containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.135952 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerDied","Data":"80f14b4d6e479a64d507e140a2b75fccd3a83dbdb2db5beaa1b87cdc4abdeef2"}
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.136292 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80f14b4d6e479a64d507e140a2b75fccd3a83dbdb2db5beaa1b87cdc4abdeef2"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.138011 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.138442 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.178916 5094 scope.go:117] "RemoveContainer" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.281220 5094 scope.go:117] "RemoveContainer" containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"
Feb 20 07:09:48 crc kubenswrapper[5094]: E0220 07:09:48.282439 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6\": container with ID starting with fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6 not found: ID does not exist" containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.282471 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"} err="failed to get container status \"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6\": rpc error: code = NotFound desc = could not find container \"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6\": container with ID starting with fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6 not found: ID does not exist"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.282494 5094 scope.go:117] "RemoveContainer" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723"
Feb 20 07:09:48 crc kubenswrapper[5094]: E0220 07:09:48.283871 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723\": container with ID starting with e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723 not found: ID does not exist" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.283938 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723"} err="failed to get container status \"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723\": rpc error: code = NotFound desc = could not find container \"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723\": container with ID starting with e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723 not found: ID does not exist"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.345124 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"]
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.368214 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"]
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.380075 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.391600 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.391856 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" containerID="cri-o://f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01" gracePeriod=30
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.392024 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" containerID="cri-o://ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e" gracePeriod=30
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.593372 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g"
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.649982 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") "
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.651613 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") "
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.651799 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") "
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.652049 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") "
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.670326 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts" (OuterVolumeSpecName: "scripts") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: "74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.685075 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z" (OuterVolumeSpecName: "kube-api-access-ppk9z") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: "74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "kube-api-access-ppk9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.692352 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data" (OuterVolumeSpecName: "config-data") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: "74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.724814 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: "74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756797 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756837 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756851 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756862 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.161161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerDied","Data":"ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b"}
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.161579 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.161219 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.167940 5094 generic.go:334] "Generic (PLEG): container finished" podID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerID="f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01" exitCode=143
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.168827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerDied","Data":"f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01"}
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.169103 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler" containerID="cri-o://f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" gracePeriod=30
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.259862 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 07:09:49 crc kubenswrapper[5094]: E0220 07:09:49.260554 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerName="nova-cell1-conductor-db-sync"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260573 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerName="nova-cell1-conductor-db-sync"
Feb 20 07:09:49 crc kubenswrapper[5094]: E0220 07:09:49.260617 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerName="nova-manage"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260626 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerName="nova-manage"
Feb 20 07:09:49 crc kubenswrapper[5094]: E0220 07:09:49.260645 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260653 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns"
Feb 20 07:09:49 crc kubenswrapper[5094]: E0220 07:09:49.260670 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="init"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260676 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="init"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260925 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerName="nova-cell1-conductor-db-sync"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260954 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260972 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerName="nova-manage"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.261866 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.264855 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.272370 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.374596 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.375133 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.375303 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.476828 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.476943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.477096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.484774 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.485931 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.494536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.584166 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.854546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" path="/var/lib/kubelet/pods/864952a5-1f2e-4930-be7f-7dbc3a2c2af8/volumes"
Feb 20 07:09:50 crc kubenswrapper[5094]: I0220 07:09:50.098854 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 07:09:50 crc kubenswrapper[5094]: W0220 07:09:50.112553 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3caa33a_a0ec_4fdc_876b_266724a5af50.slice/crio-04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766 WatchSource:0}: Error finding container 04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766: Status 404 returned error can't find the container with id 04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766
Feb 20 07:09:50 crc kubenswrapper[5094]: I0220 07:09:50.183227 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerStarted","Data":"04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766"}
Feb 20 07:09:51 crc kubenswrapper[5094]: I0220 07:09:51.199926 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerStarted","Data":"d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244"}
Feb 20 07:09:51 crc kubenswrapper[5094]: I0220 07:09:51.200371 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 20 07:09:51 crc kubenswrapper[5094]: I0220 07:09:51.238894 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.238862841 podStartE2EDuration="2.238862841s" podCreationTimestamp="2026-02-20 07:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:51.230748456 +0000 UTC m=+1406.103375167" watchObservedRunningTime="2026-02-20 07:09:51.238862841 +0000 UTC m=+1406.111489562"
Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.736915 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.739381 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.741645 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.741835 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler"
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.229039 5094 generic.go:334] "Generic (PLEG): container finished" podID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerID="ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e" exitCode=0
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.229159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerDied","Data":"ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e"}
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.229727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerDied","Data":"4e611ddd33c80e60aea3bd50de5978b4641402d17394778695691b74d9d5ce40"}
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.229746 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e611ddd33c80e60aea3bd50de5978b4641402d17394778695691b74d9d5ce40"
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.425442 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492447 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") "
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") "
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492608 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") "
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492760 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") "
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.493684 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs" (OuterVolumeSpecName: "logs") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.495478 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.535751 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data" (OuterVolumeSpecName: "config-data") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.540547 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6" (OuterVolumeSpecName: "kube-api-access-sw5c6") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "kube-api-access-sw5c6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.544679 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.597748 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.597800 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.597813 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.602134 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.698834 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"bae6adb7-0e70-4689-aebe-d027da87abbb\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") "
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.698909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"bae6adb7-0e70-4689-aebe-d027da87abbb\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") "
Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.699157 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdz9n\" (UniqueName:
\"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"bae6adb7-0e70-4689-aebe-d027da87abbb\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.704580 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n" (OuterVolumeSpecName: "kube-api-access-vdz9n") pod "bae6adb7-0e70-4689-aebe-d027da87abbb" (UID: "bae6adb7-0e70-4689-aebe-d027da87abbb"). InnerVolumeSpecName "kube-api-access-vdz9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.726050 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bae6adb7-0e70-4689-aebe-d027da87abbb" (UID: "bae6adb7-0e70-4689-aebe-d027da87abbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.739108 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data" (OuterVolumeSpecName: "config-data") pod "bae6adb7-0e70-4689-aebe-d027da87abbb" (UID: "bae6adb7-0e70-4689-aebe-d027da87abbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.804413 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.804460 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.804478 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.242517 5094 generic.go:334] "Generic (PLEG): container finished" podID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" exitCode=0 Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.243034 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244389 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerDied","Data":"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85"} Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerDied","Data":"d76d353af2545ad8202d965c9d3961ffa837c726f661aaff8084a4e9ecb335a6"} Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244886 5094 scope.go:117] "RemoveContainer" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.284219 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.291096 5094 scope.go:117] "RemoveContainer" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.291622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85\": container with ID starting with f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85 not found: ID does not exist" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.291683 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85"} err="failed to get container status \"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85\": rpc error: code = NotFound 
desc = could not find container \"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85\": container with ID starting with f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85 not found: ID does not exist" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.309275 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.330926 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.349383 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.366291 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.367284 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367311 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.367528 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367539 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.367571 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367580 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" 
containerName="nova-scheduler-scheduler" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367845 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367858 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367871 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.369067 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.374181 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.397537 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.408067 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.409944 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.412567 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.423855 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.522972 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523022 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523333 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523799 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627328 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627389 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627487 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod 
\"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627569 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627594 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.629646 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.638201 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.638900 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.639745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.648990 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.649683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.660569 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " 
pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.709011 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.729081 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.257562 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.259002 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerStarted","Data":"1ceadaa23027d492c09e5b48aebf25f3f67173315364e5177caaee816d00585c"} Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.260251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerStarted","Data":"7169a135be999d7785f50c1ff2763f918e250f0425691f0c798f5fb9269603bf"} Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.272467 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.863048 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" path="/var/lib/kubelet/pods/049a4617-20ed-4f86-a0c4-a3a59bd44f26/volumes" Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.864238 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" path="/var/lib/kubelet/pods/bae6adb7-0e70-4689-aebe-d027da87abbb/volumes" Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.276651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerStarted","Data":"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b"} Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.276775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerStarted","Data":"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c"} Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.281170 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerStarted","Data":"af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f"} Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.312547 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.312518921 podStartE2EDuration="2.312518921s" podCreationTimestamp="2026-02-20 07:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:56.305522144 +0000 UTC m=+1411.178148895" watchObservedRunningTime="2026-02-20 07:09:56.312518921 +0000 UTC m=+1411.185145632" Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.339630 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.33960481 podStartE2EDuration="2.33960481s" podCreationTimestamp="2026-02-20 07:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:56.326493686 +0000 UTC m=+1411.199120407" watchObservedRunningTime="2026-02-20 07:09:56.33960481 +0000 UTC m=+1411.212231531" Feb 20 07:09:59 crc kubenswrapper[5094]: I0220 07:09:59.634043 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:59 crc kubenswrapper[5094]: I0220 07:09:59.709931 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.106928 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.107904 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.709397 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.730475 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.730533 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.755163 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 07:10:05 crc kubenswrapper[5094]: I0220 07:10:05.432895 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 07:10:05 crc kubenswrapper[5094]: I0220 07:10:05.814868 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:05 crc kubenswrapper[5094]: I0220 07:10:05.814874 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:08 crc kubenswrapper[5094]: I0220 07:10:08.380792 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492205 5094 generic.go:334] "Generic (PLEG): container finished" podID="043bc1d7-f57a-481d-b132-71ef45e85480" containerID="f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5" exitCode=137 Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492392 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerDied","Data":"f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492937 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerDied","Data":"9e410d4605ab52b279cb99a108d57c885a2eae7e022b21ab722b6893f02390c6"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492960 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e410d4605ab52b279cb99a108d57c885a2eae7e022b21ab722b6893f02390c6" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496242 5094 generic.go:334] "Generic (PLEG): container finished" podID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" 
containerID="3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484" exitCode=137 Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerDied","Data":"3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496296 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerDied","Data":"6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496306 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.539000 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.548474 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.589989 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.590311 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" containerID="cri-o://ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" gracePeriod=30 Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734186 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"043bc1d7-f57a-481d-b132-71ef45e85480\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734318 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"043bc1d7-f57a-481d-b132-71ef45e85480\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734365 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"043bc1d7-f57a-481d-b132-71ef45e85480\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") 
" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734555 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734593 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.735245 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs" (OuterVolumeSpecName: "logs") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.746603 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf" (OuterVolumeSpecName: "kube-api-access-x4nvf") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "kube-api-access-x4nvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.747338 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg" (OuterVolumeSpecName: "kube-api-access-zwglg") pod "043bc1d7-f57a-481d-b132-71ef45e85480" (UID: "043bc1d7-f57a-481d-b132-71ef45e85480"). InnerVolumeSpecName "kube-api-access-zwglg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.770674 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "043bc1d7-f57a-481d-b132-71ef45e85480" (UID: "043bc1d7-f57a-481d-b132-71ef45e85480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.776385 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data" (OuterVolumeSpecName: "config-data") pod "043bc1d7-f57a-481d-b132-71ef45e85480" (UID: "043bc1d7-f57a-481d-b132-71ef45e85480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.783464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.784551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data" (OuterVolumeSpecName: "config-data") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.837840 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838037 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838127 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838187 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838262 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838334 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4nvf\" (UniqueName: 
\"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838398 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.946138 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.041900 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"715094df-6704-4332-b990-95d790fd5ff1\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.046313 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s" (OuterVolumeSpecName: "kube-api-access-b7k8s") pod "715094df-6704-4332-b990-95d790fd5ff1" (UID: "715094df-6704-4332-b990-95d790fd5ff1"). InnerVolumeSpecName "kube-api-access-b7k8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.144932 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512056 5094 generic.go:334] "Generic (PLEG): container finished" podID="715094df-6704-4332-b990-95d790fd5ff1" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" exitCode=2 Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512181 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512227 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerDied","Data":"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a"} Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerDied","Data":"3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3"} Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512307 5094 scope.go:117] "RemoveContainer" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512571 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512206 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.564893 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.578207 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.588540 5094 scope.go:117] "RemoveContainer" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.589581 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a\": container with ID starting with ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a not found: ID does not exist" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.589672 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a"} err="failed to get container status \"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a\": rpc error: code = NotFound desc = could not find container \"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a\": container with ID starting with ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a not found: ID does not exist" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.616422 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.630306 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.659253 5094 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.668879 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.668944 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.669036 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.669049 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.669073 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.669087 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.669115 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.669127 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.692664 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 
07:10:13.692760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.692794 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.692834 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.694262 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.698395 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.702166 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.701672 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.723374 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.739253 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.744864 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.748635 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.748975 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.765341 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.776483 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.786656 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.794542 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.796307 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.802375 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.803137 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.803681 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.852307 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" path="/var/lib/kubelet/pods/043bc1d7-f57a-481d-b132-71ef45e85480/volumes" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.853002 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715094df-6704-4332-b990-95d790fd5ff1" path="/var/lib/kubelet/pods/715094df-6704-4332-b990-95d790fd5ff1/volumes" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.853740 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" path="/var/lib/kubelet/pods/c1c7c378-78bc-48bd-932c-fa19cf4e6284/volumes" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.890966 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891051 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891085 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891129 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891213 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891284 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") 
pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891379 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891797 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994905 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995099 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995155 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995199 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995241 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995316 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995354 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995435 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.996129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995486 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.002864 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.003575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.004574 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.004837 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.005847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.017133 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.025546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.025987 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.033290 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.040800 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.070656 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.099917 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.099978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.100105 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 
07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.100135 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.106395 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.106444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.109377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.121374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.414268 5094 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.617543 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.692544 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.735337 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.735927 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.739476 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.747172 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.798754 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799431 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent" containerID="cri-o://897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa" gracePeriod=30 Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799509 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd" containerID="cri-o://07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4" gracePeriod=30 Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799606 5094 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core" containerID="cri-o://40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf" gracePeriod=30 Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799553 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-notification-agent" containerID="cri-o://d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5" gracePeriod=30 Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.910865 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.933808 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.548621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerStarted","Data":"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.549147 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerStarted","Data":"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.549161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerStarted","Data":"0d734b120ad598d02afd1f0952c1bd5ccf1f9badb30cba5e61b1f4e9fc055b42"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.555153 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerStarted","Data":"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.555218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerStarted","Data":"d1d010e8fd8bc9707a8121e22dcd018ce86f613e9bbcc45a6bd9ab2c3e354582"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559844 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4" exitCode=0 Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559879 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf" exitCode=2 Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559887 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa" exitCode=0 Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559969 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559981 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.566130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerStarted","Data":"8a0b13fdbdedc5064e8f68c82ce215006ed4f58e7530fd19fcca453a9915c200"} Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.567301 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.582271 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.601955 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6019214059999998 podStartE2EDuration="2.601921406s" podCreationTimestamp="2026-02-20 07:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:15.572941583 +0000 UTC m=+1430.445568294" watchObservedRunningTime="2026-02-20 07:10:15.601921406 +0000 UTC m=+1430.474548127" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.614495 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.614470367 podStartE2EDuration="2.614470367s" podCreationTimestamp="2026-02-20 07:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:15.590735478 +0000 UTC m=+1430.463362209" watchObservedRunningTime="2026-02-20 07:10:15.614470367 +0000 UTC m=+1430.487097088" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.756676 5094 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.760788 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.814790 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944101 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944203 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: 
\"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047006 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047148 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: 
\"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047191 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047243 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047270 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048240 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " 
pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048629 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048628 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048861 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.071486 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.106618 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.591424 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerStarted","Data":"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9"} Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.593354 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.596720 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.611306 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.228997945 podStartE2EDuration="3.61127687s" podCreationTimestamp="2026-02-20 07:10:13 +0000 UTC" firstStartedPulling="2026-02-20 07:10:14.93355958 +0000 UTC m=+1429.806186281" lastFinishedPulling="2026-02-20 07:10:15.315838495 +0000 UTC m=+1430.188465206" observedRunningTime="2026-02-20 07:10:16.608521694 +0000 UTC m=+1431.481148405" watchObservedRunningTime="2026-02-20 07:10:16.61127687 +0000 UTC m=+1431.483903581" Feb 20 07:10:17 crc kubenswrapper[5094]: I0220 07:10:17.601800 5094 generic.go:334] "Generic (PLEG): container finished" podID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerID="a4c4f92b36bb2b7d701dbf8c3f7817a427ce69bddf6bb34e82a6884705e2608c" exitCode=0 Feb 20 07:10:17 crc kubenswrapper[5094]: I0220 07:10:17.601913 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerDied","Data":"a4c4f92b36bb2b7d701dbf8c3f7817a427ce69bddf6bb34e82a6884705e2608c"} Feb 20 07:10:17 crc kubenswrapper[5094]: I0220 07:10:17.602179 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerStarted","Data":"11d132d9afa27137f856e4e3ac63fa1a46eebfeb7ef403dea2957ddcdaf2acba"} Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.483276 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.613847 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerStarted","Data":"fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff"} Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.614039 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" containerID="cri-o://f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" gracePeriod=30 Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.614102 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" containerID="cri-o://18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" gracePeriod=30 Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.661893 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7677694455-llk7m" podStartSLOduration=3.66186123 podStartE2EDuration="3.66186123s" podCreationTimestamp="2026-02-20 07:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:18.651485661 +0000 UTC m=+1433.524112372" watchObservedRunningTime="2026-02-20 07:10:18.66186123 +0000 UTC m=+1433.534487971" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.047061 5094 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.070723 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.070772 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.207857 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.332693 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.332887 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.333834 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.333955 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: 
\"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334116 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334168 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334495 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334833 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.339857 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.347781 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts" (OuterVolumeSpecName: "scripts") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.348302 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v" (OuterVolumeSpecName: "kube-api-access-npk6v") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "kube-api-access-npk6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.382526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.436862 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.437122 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.437234 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.437294 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.471919 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.481204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data" (OuterVolumeSpecName: "config-data") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.539628 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.539903 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.632311 5094 generic.go:334] "Generic (PLEG): container finished" podID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" exitCode=143 Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.632409 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerDied","Data":"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c"} Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636058 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5" exitCode=0 Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636140 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"} Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636232 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"33dad10331613824fe73aa48d99c70dc5318fba2b2e24e20fa4abae1bee74f21"} Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636256 5094 scope.go:117] "RemoveContainer" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636854 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.663233 5094 scope.go:117] "RemoveContainer" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.677182 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.691036 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.706819 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.707660 5094 scope.go:117] "RemoveContainer" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.709783 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" 
containerName="ceilometer-notification-agent" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.709804 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-notification-agent" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.709823 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.709829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.709860 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.709866 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.710172 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710184 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710494 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710525 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710532 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710547 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-notification-agent" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.712461 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.715778 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.715974 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.716003 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.728892 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.804508 5094 scope.go:117] "RemoveContainer" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.835403 5094 scope.go:117] "RemoveContainer" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.836201 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4\": container with ID starting with 07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4 not found: ID does not exist" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836257 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"} err="failed to get container status \"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4\": rpc error: code = NotFound desc = could not find container \"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4\": container with ID starting with 07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4 not found: ID does not exist" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836294 5094 scope.go:117] "RemoveContainer" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.836894 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf\": container with ID starting with 40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf not found: ID does not exist" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836914 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"} err="failed to get container status \"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf\": rpc error: code = NotFound desc = could not find container \"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf\": container with ID starting with 40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf not found: ID does not exist" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836929 5094 scope.go:117] "RemoveContainer" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 
07:10:19.837463 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5\": container with ID starting with d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5 not found: ID does not exist" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.837542 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"} err="failed to get container status \"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5\": rpc error: code = NotFound desc = could not find container \"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5\": container with ID starting with d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5 not found: ID does not exist" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.837620 5094 scope.go:117] "RemoveContainer" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa" Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.838095 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa\": container with ID starting with 897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa not found: ID does not exist" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.838132 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"} err="failed to get container status \"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa\": rpc 
error: code = NotFound desc = could not find container \"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa\": container with ID starting with 897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa not found: ID does not exist" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.845882 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846111 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846226 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846305 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod 
\"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846558 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846838 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.855760 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" path="/var/lib/kubelet/pods/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8/volumes" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949359 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949416 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949768 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949810 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 
crc kubenswrapper[5094]: I0220 07:10:19.949907 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.951865 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.952335 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960654 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.971761 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:20 crc kubenswrapper[5094]: I0220 07:10:20.105047 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:20 crc kubenswrapper[5094]: W0220 07:10:20.643895 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51465a58_4a65_4b58_b7fa_1180b1245e8a.slice/crio-5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674 WatchSource:0}: Error finding container 5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674: Status 404 returned error can't find the container with id 5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674 Feb 20 07:10:20 crc kubenswrapper[5094]: I0220 07:10:20.661424 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:21 crc kubenswrapper[5094]: I0220 07:10:21.187639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:21 crc kubenswrapper[5094]: I0220 07:10:21.666747 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c"} Feb 20 07:10:21 crc kubenswrapper[5094]: I0220 07:10:21.666866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674"} Feb 20 07:10:21 crc kubenswrapper[5094]: E0220 07:10:21.996261 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf31fb44b_cd20_42d8_a384_0e4a800ea177.slice/crio-18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf31fb44b_cd20_42d8_a384_0e4a800ea177.slice/crio-conmon-18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b.scope\": RecentStats: unable to find data in memory cache]" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.327655 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.516136 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.516813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.517539 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs" (OuterVolumeSpecName: "logs") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.518418 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.518473 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.519028 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.532216 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn" (OuterVolumeSpecName: "kube-api-access-qh6fn") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "kube-api-access-qh6fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.558574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.561835 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data" (OuterVolumeSpecName: "config-data") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.620540 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.620582 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.620592 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.678814 5094 generic.go:334] "Generic (PLEG): container finished" podID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" exitCode=0 Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.678876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerDied","Data":"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b"} Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.678914 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.679859 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerDied","Data":"7169a135be999d7785f50c1ff2763f918e250f0425691f0c798f5fb9269603bf"} Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.679893 5094 scope.go:117] "RemoveContainer" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.695898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0"} Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.725603 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.736163 5094 scope.go:117] "RemoveContainer" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.744206 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.761012 5094 scope.go:117] "RemoveContainer" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.761466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b\": container with ID starting with 18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b not found: ID does not exist" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 
07:10:22.761522 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b"} err="failed to get container status \"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b\": rpc error: code = NotFound desc = could not find container \"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b\": container with ID starting with 18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b not found: ID does not exist" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.761571 5094 scope.go:117] "RemoveContainer" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.762108 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c\": container with ID starting with f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c not found: ID does not exist" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.762149 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c"} err="failed to get container status \"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c\": rpc error: code = NotFound desc = could not find container \"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c\": container with ID starting with f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c not found: ID does not exist" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.771881 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.772399 5094 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772420 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.772463 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772471 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772665 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772731 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.773964 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.777273 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.777548 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.778580 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.780776 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.925547 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926031 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926078 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926134 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926167 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926188 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028167 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028220 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc 
kubenswrapper[5094]: I0220 07:10:23.028269 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028312 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028546 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.030759 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.032723 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.033571 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.033579 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.036092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.060104 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.103721 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.642473 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 07:10:23 crc kubenswrapper[5094]: W0220 07:10:23.649917 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1fa388_c752_45bf_9bd0_25ef5ac0052e.slice/crio-0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8 WatchSource:0}: Error finding container 0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8: Status 404 returned error can't find the container with id 0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.717048 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b"}
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.720664 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerStarted","Data":"0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8"}
Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.881023 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" path="/var/lib/kubelet/pods/f31fb44b-cd20-42d8-a384-0e4a800ea177/volumes"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.041311 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.071897 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.071940 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.074860 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.439008 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.732860 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerStarted","Data":"3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae"}
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.733247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerStarted","Data":"8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857"}
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.742905 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent" containerID="cri-o://01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c" gracePeriod=30
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0"}
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746873 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746908 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core" containerID="cri-o://e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b" gracePeriod=30
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746961 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd" containerID="cri-o://4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0" gracePeriod=30
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746956 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent" containerID="cri-o://ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0" gracePeriod=30
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.761959 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.769630 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.769611936 podStartE2EDuration="2.769611936s" podCreationTimestamp="2026-02-20 07:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:24.765773384 +0000 UTC m=+1439.638400095" watchObservedRunningTime="2026-02-20 07:10:24.769611936 +0000 UTC m=+1439.642238647"
Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.793863 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.555401157 podStartE2EDuration="5.793840996s" podCreationTimestamp="2026-02-20 07:10:19 +0000 UTC" firstStartedPulling="2026-02-20 07:10:20.647238508 +0000 UTC m=+1435.519865219" lastFinishedPulling="2026-02-20 07:10:23.885678327 +0000 UTC m=+1438.758305058" observedRunningTime="2026-02-20 07:10:24.783044398 +0000 UTC m=+1439.655671109" watchObservedRunningTime="2026-02-20 07:10:24.793840996 +0000 UTC m=+1439.666467707"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.018223 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"]
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.019474 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.023854 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.023874 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.031979 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"]
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.083100 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.083166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.083214 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.084068 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.085069 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.085276 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189160 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189223 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189265 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189329 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.199036 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.199067 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.212447 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.212550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.346597 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.756237 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0" exitCode=0
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.756695 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b" exitCode=2
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.756722 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0" exitCode=0
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.757662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0"}
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.757690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b"}
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.757700 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0"}
Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.826167 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"]
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.026859 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"]
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.029369 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.061845 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"]
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.108683 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.115988 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.116029 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.116129 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.193784 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"]
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.194163 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns" containerID="cri-o://6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c" gracePeriod=10
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.218200 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.218259 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.218422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.219297 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.220003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.243612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.413609 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.728381 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774279 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c" exitCode=0
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774374 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerDied","Data":"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"}
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774408 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerDied","Data":"b6d15829f95b8aff57b91d13f507a8fa1a3e6f6b0bdf9f807b5778fd0588a0ff"}
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774430 5094 scope.go:117] "RemoveContainer" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774610 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.791062 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerStarted","Data":"f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2"}
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.791124 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerStarted","Data":"a97113103dcc5c7ddecb6fa66034d60390c12cce955ecc18177eadd21f50c5ad"}
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.823522 5094 scope.go:117] "RemoveContainer" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.833780 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tp2rg" podStartSLOduration=2.83375479 podStartE2EDuration="2.83375479s" podCreationTimestamp="2026-02-20 07:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:26.811878227 +0000 UTC m=+1441.684504938" watchObservedRunningTime="2026-02-20 07:10:26.83375479 +0000 UTC m=+1441.706381501"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836561 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") "
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836623 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") "
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836721 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") "
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836954 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") "
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836984 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") "
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.837078 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") "
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.848285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd" (OuterVolumeSpecName: "kube-api-access-88nqd") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "kube-api-access-88nqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.855587 5094 scope.go:117] "RemoveContainer" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"
Feb 20 07:10:26 crc kubenswrapper[5094]: E0220 07:10:26.856017 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c\": container with ID starting with 6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c not found: ID does not exist" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.856142 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"} err="failed to get container status \"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c\": rpc error: code = NotFound desc = could not find container \"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c\": container with ID starting with 6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c not found: ID does not exist"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.856229 5094 scope.go:117] "RemoveContainer" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e"
Feb 20 07:10:26 crc kubenswrapper[5094]: E0220 07:10:26.856527 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e\": container with ID starting with f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e not found: ID does not exist" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.856620 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e"} err="failed to get container status \"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e\": rpc error: code = NotFound desc = could not find container \"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e\": container with ID starting with f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e not found: ID does not exist"
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.953964 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.956926 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config" (OuterVolumeSpecName: "config") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.984731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:26.999020 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.010341 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.024014 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093883 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093914 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093924 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093934 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093943 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.159853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"]
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.166937 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"]
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.187210 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"]
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.806449 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" exitCode=0
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.806503 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632"}
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.806852 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerStarted","Data":"66d7454047a3aa7a1aca34ae635b12f5cfc407d75ffd9cc4ed6c357443fe8696"}
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.854388 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" path="/var/lib/kubelet/pods/5c3bfc89-edf8-4721-a74c-b01a81025919/volumes"
Feb 20 07:10:28 crc kubenswrapper[5094]: I0220 07:10:28.825387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerStarted","Data":"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1"}
Feb 20 07:10:28 crc kubenswrapper[5094]: I0220 07:10:28.838804 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c" exitCode=0
Feb 20 07:10:28 crc kubenswrapper[5094]: I0220 07:10:28.838861 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c"}
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.074817 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144210 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144322 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144492 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144721 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144901 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.146200 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.146631 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.157875 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts" (OuterVolumeSpecName: "scripts") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.157880 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt" (OuterVolumeSpecName: "kube-api-access-fmfjt") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "kube-api-access-fmfjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.194201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.214681 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.239908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.247943 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.247988 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248001 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248017 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248028 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248044 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmfjt\" (UniqueName: 
\"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248056 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.290815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data" (OuterVolumeSpecName: "config-data") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.349929 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.861024 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" exitCode=0 Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.861099 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1"} Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.887947 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674"} Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 
07:10:29.888051 5094 scope.go:117] "RemoveContainer" containerID="4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.888472 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.946965 5094 scope.go:117] "RemoveContainer" containerID="e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b" Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.990304 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.016384 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.030894 5094 scope.go:117] "RemoveContainer" containerID="ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.043539 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044246 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044263 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core" Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044291 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044300 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns" Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044316 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044324 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent" Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044337 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044345 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd" Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044356 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="init" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044363 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="init" Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044403 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044412 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044661 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044687 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.045847 5094 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.045913 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.045984 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.049657 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.060616 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.061683 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.061867 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.093746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.109967 5094 scope.go:117] "RemoveContainer" containerID="01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182048 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182118 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182218 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182247 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182272 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182343 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182375 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284093 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284257 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc 
kubenswrapper[5094]: I0220 07:10:30.284277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284294 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284372 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.292152 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.292359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.292776 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.293217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.293016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.294242 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.294427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.315092 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.428615 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.902356 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerStarted","Data":"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4"} Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.942566 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kbmdf" podStartSLOduration=2.470435377 podStartE2EDuration="4.942540463s" podCreationTimestamp="2026-02-20 07:10:26 +0000 UTC" firstStartedPulling="2026-02-20 07:10:27.809648602 +0000 UTC m=+1442.682275333" lastFinishedPulling="2026-02-20 07:10:30.281753708 +0000 UTC m=+1445.154380419" observedRunningTime="2026-02-20 07:10:30.931790895 +0000 UTC m=+1445.804417616" watchObservedRunningTime="2026-02-20 07:10:30.942540463 +0000 UTC m=+1445.815167184" Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.987478 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:30 crc kubenswrapper[5094]: W0220 07:10:30.992984 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1218d679_0e51_4bef_9526_db16c8783d8b.slice/crio-98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd WatchSource:0}: Error finding container 98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd: Status 404 returned error can't find the 
container with id 98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd Feb 20 07:10:31 crc kubenswrapper[5094]: I0220 07:10:31.863471 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" path="/var/lib/kubelet/pods/51465a58-4a65-4b58-b7fa-1180b1245e8a/volumes" Feb 20 07:10:31 crc kubenswrapper[5094]: I0220 07:10:31.915022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4"} Feb 20 07:10:31 crc kubenswrapper[5094]: I0220 07:10:31.915080 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd"} Feb 20 07:10:32 crc kubenswrapper[5094]: E0220 07:10:32.292389 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85a1c623_233b_4b7e_9a57_e761a5ad27ab.slice/crio-f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85a1c623_233b_4b7e_9a57_e761a5ad27ab.slice/crio-conmon-f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2.scope\": RecentStats: unable to find data in memory cache]" Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.937261 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a"} Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.937632 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c"} Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.942061 5094 generic.go:334] "Generic (PLEG): container finished" podID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerID="f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2" exitCode=0 Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.942125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerDied","Data":"f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2"} Feb 20 07:10:33 crc kubenswrapper[5094]: I0220 07:10:33.105028 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:33 crc kubenswrapper[5094]: I0220 07:10:33.105150 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.080977 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.097394 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.116924 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117318 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" 
containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117670 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117740 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117838 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.419233 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.485813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.485929 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.486268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.486320 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.514301 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts" (OuterVolumeSpecName: "scripts") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.520054 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw" (OuterVolumeSpecName: "kube-api-access-lz9vw") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "kube-api-access-lz9vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.564212 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data" (OuterVolumeSpecName: "config-data") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.570547 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590017 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590050 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590060 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590070 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.967094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerDied","Data":"a97113103dcc5c7ddecb6fa66034d60390c12cce955ecc18177eadd21f50c5ad"} Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.967142 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.967180 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97113103dcc5c7ddecb6fa66034d60390c12cce955ecc18177eadd21f50c5ad" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.980964 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92"} Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.996394 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.032536 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.874135313 podStartE2EDuration="6.032506184s" podCreationTimestamp="2026-02-20 07:10:29 +0000 UTC" firstStartedPulling="2026-02-20 07:10:30.996483114 +0000 UTC m=+1445.869109835" lastFinishedPulling="2026-02-20 07:10:34.154853985 +0000 UTC m=+1449.027480706" observedRunningTime="2026-02-20 07:10:35.014434232 +0000 UTC m=+1449.887060943" watchObservedRunningTime="2026-02-20 07:10:35.032506184 +0000 UTC m=+1449.905132915" Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.219236 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.247775 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.248099 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" containerID="cri-o://8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857" 
gracePeriod=30 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.248653 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" containerID="cri-o://3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae" gracePeriod=30 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.256717 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.258043 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" containerID="cri-o://af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f" gracePeriod=30 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.995110 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerID="8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857" exitCode=143 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.995233 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerDied","Data":"8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857"} Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.995754 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:10:36 crc kubenswrapper[5094]: I0220 07:10:36.414185 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:36 crc kubenswrapper[5094]: I0220 07:10:36.414237 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:36 crc 
kubenswrapper[5094]: I0220 07:10:36.483885 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.007956 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" containerID="cri-o://c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" gracePeriod=30 Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.008044 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" containerID="cri-o://ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" gracePeriod=30 Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.100547 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.164122 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:38 crc kubenswrapper[5094]: I0220 07:10:38.028546 5094 generic.go:334] "Generic (PLEG): container finished" podID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" exitCode=143 Feb 20 07:10:38 crc kubenswrapper[5094]: I0220 07:10:38.028663 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerDied","Data":"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32"} Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.044685 5094 generic.go:334] "Generic (PLEG): container finished" podID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" 
containerID="af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f" exitCode=0 Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.045350 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kbmdf" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" containerID="cri-o://5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" gracePeriod=2 Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.044866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerDied","Data":"af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f"} Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.479390 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.517558 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.517632 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.517773 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\" (UID: 
\"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.525172 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn" (OuterVolumeSpecName: "kube-api-access-w8xqn") pod "c1c68ac6-9f96-4a39-b477-7ad74a04dff9" (UID: "c1c68ac6-9f96-4a39-b477-7ad74a04dff9"). InnerVolumeSpecName "kube-api-access-w8xqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.554065 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data" (OuterVolumeSpecName: "config-data") pod "c1c68ac6-9f96-4a39-b477-7ad74a04dff9" (UID: "c1c68ac6-9f96-4a39-b477-7ad74a04dff9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.577441 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1c68ac6-9f96-4a39-b477-7ad74a04dff9" (UID: "c1c68ac6-9f96-4a39-b477-7ad74a04dff9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.581892 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.619627 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"3b5cbe24-4197-464c-b995-1a1708b551c4\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.619959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"3b5cbe24-4197-464c-b995-1a1708b551c4\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.619999 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"3b5cbe24-4197-464c-b995-1a1708b551c4\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.620443 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.620464 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.620473 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc 
kubenswrapper[5094]: I0220 07:10:39.622746 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities" (OuterVolumeSpecName: "utilities") pod "3b5cbe24-4197-464c-b995-1a1708b551c4" (UID: "3b5cbe24-4197-464c-b995-1a1708b551c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.627608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k" (OuterVolumeSpecName: "kube-api-access-q4c5k") pod "3b5cbe24-4197-464c-b995-1a1708b551c4" (UID: "3b5cbe24-4197-464c-b995-1a1708b551c4"). InnerVolumeSpecName "kube-api-access-q4c5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.675959 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b5cbe24-4197-464c-b995-1a1708b551c4" (UID: "3b5cbe24-4197-464c-b995-1a1708b551c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.721634 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.721675 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.721685 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.115386 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerDied","Data":"1ceadaa23027d492c09e5b48aebf25f3f67173315364e5177caaee816d00585c"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.115772 5094 scope.go:117] "RemoveContainer" containerID="af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.115568 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.120079 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerID="3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae" exitCode=0 Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.120145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerDied","Data":"3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122531 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" exitCode=0 Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"66d7454047a3aa7a1aca34ae635b12f5cfc407d75ffd9cc4ed6c357443fe8696"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122633 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.140033 5094 scope.go:117] "RemoveContainer" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.178345 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.193891 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:37220->10.217.0.195:8775: read: connection reset by peer" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.194273 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:37206->10.217.0.195:8775: read: connection reset by peer" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.197225 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.213953 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.225088 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.241221 5094 scope.go:117] "RemoveContainer" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.246261 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 
07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248626 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-content" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248648 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-content" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248672 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248717 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248725 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248741 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerName="nova-manage" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248748 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerName="nova-manage" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248771 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-utilities" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248778 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-utilities" Feb 20 
07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249013 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249032 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249055 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerName="nova-manage" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249971 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.259123 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.318090 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.357676 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.357748 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.357914 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.416526 5094 scope.go:117] "RemoveContainer" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.466583 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.466641 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.466771 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.474447 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.479803 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.482401 5094 scope.go:117] "RemoveContainer" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.485998 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.488331 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4\": container with ID starting with 5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4 not found: ID does not exist" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.488379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4"} err="failed to get container status \"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4\": rpc error: code = NotFound desc = could not find container \"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4\": container with ID starting with 5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4 not found: ID does not exist" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.488413 5094 scope.go:117] 
"RemoveContainer" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.489140 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1\": container with ID starting with 606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1 not found: ID does not exist" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.489173 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1"} err="failed to get container status \"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1\": rpc error: code = NotFound desc = could not find container \"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1\": container with ID starting with 606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1 not found: ID does not exist" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.489203 5094 scope.go:117] "RemoveContainer" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.492148 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632\": container with ID starting with ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632 not found: ID does not exist" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.492172 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632"} err="failed to get container status \"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632\": rpc error: code = NotFound desc = could not find container \"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632\": container with ID starting with ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632 not found: ID does not exist" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.540928 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.546828 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.650168 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681074 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681135 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod 
\"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681231 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681263 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.686757 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs" (OuterVolumeSpecName: "logs") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.690843 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb" (OuterVolumeSpecName: "kube-api-access-cxhmb") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "kube-api-access-cxhmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.692346 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.692681 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.716603 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.719269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data" (OuterVolumeSpecName: "config-data") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.766930 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.779748 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.793858 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.793971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.794116 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.794160 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.794309 
5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795074 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795115 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795124 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795134 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795146 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs" (OuterVolumeSpecName: "logs") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.797982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d" (OuterVolumeSpecName: "kube-api-access-cgt5d") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "kube-api-access-cgt5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.824386 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data" (OuterVolumeSpecName: "config-data") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.824864 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.849694 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897505 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897543 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897552 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897576 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897589 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.075269 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.133466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerDied","Data":"0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.133998 5094 scope.go:117] "RemoveContainer" 
containerID="3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.133542 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.138690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerStarted","Data":"15996ec151f9ac116c6912aa4e992bb9af3fc72485808d76d5e14b93da12f57f"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.140922 5094 generic.go:334] "Generic (PLEG): container finished" podID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" exitCode=0 Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.140965 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerDied","Data":"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.140979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerDied","Data":"0d734b120ad598d02afd1f0952c1bd5ccf1f9badb30cba5e61b1f4e9fc055b42"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.141038 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.225232 5094 scope.go:117] "RemoveContainer" containerID="8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.229689 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.263441 5094 scope.go:117] "RemoveContainer" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.282384 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.300537 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.301535 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.301626 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.301721 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.301779 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.301858 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.301916 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.302006 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302312 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302690 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302861 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302931 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302990 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.304245 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.308867 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.310230 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.318940 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.341490 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.344996 5094 scope.go:117] "RemoveContainer" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.351681 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.362731 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.366478 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.369535 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.369928 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.369967 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.371086 5094 scope.go:117] "RemoveContainer" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.372514 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.374927 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d\": container with ID starting with c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d not found: ID does not exist" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.374977 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d"} err="failed to get container status \"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d\": rpc error: code = NotFound desc = could not find container \"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d\": container with ID starting with c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d not found: ID does not exist" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.375013 5094 
scope.go:117] "RemoveContainer" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.375550 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32\": container with ID starting with ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32 not found: ID does not exist" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.375584 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32"} err="failed to get container status \"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32\": rpc error: code = NotFound desc = could not find container \"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32\": container with ID starting with ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32 not found: ID does not exist" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415752 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415807 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415867 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415885 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416406 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416696 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416813 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.520106 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.520741 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"nova-api-0\" (UID: 
\"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.521049 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.521454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.523489 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.525011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.525219 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.526610 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.526799 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.527087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.527236 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.524696 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.528428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc 
kubenswrapper[5094]: I0220 07:10:41.528690 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.529868 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.531350 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.532802 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.533470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.538424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.538781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.539190 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.547503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.639519 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.692308 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.858319 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" path="/var/lib/kubelet/pods/3b5cbe24-4197-464c-b995-1a1708b551c4/volumes"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.859331 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" path="/var/lib/kubelet/pods/c0eaf3b2-613b-41c2-9eac-ce8093ccec66/volumes"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.859907 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" path="/var/lib/kubelet/pods/c1c68ac6-9f96-4a39-b477-7ad74a04dff9/volumes"
Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.862164 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" path="/var/lib/kubelet/pods/ee1fa388-c752-45bf-9bd0-25ef5ac0052e/volumes"
Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.163406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerStarted","Data":"e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e"}
Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.187155 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.187132942 podStartE2EDuration="2.187132942s" podCreationTimestamp="2026-02-20 07:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:42.183080825 +0000 UTC m=+1457.055707546" watchObservedRunningTime="2026-02-20 07:10:42.187132942 +0000 UTC m=+1457.059759653"
Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.207283 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 07:10:42 crc kubenswrapper[5094]: W0220 07:10:42.212857 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf11aa87b_3964_4a62_871f_bdf7d1ad7848.slice/crio-cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda WatchSource:0}: Error finding container cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda: Status 404 returned error can't find the container with id cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda
Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.289469 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 07:10:42 crc kubenswrapper[5094]: W0220 07:10:42.298540 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca3adffb_7baf_45db_ab16_cc1c63510fec.slice/crio-1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c WatchSource:0}: Error finding container 1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c: Status 404 returned error can't find the container with id 1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.212451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerStarted","Data":"50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d"}
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.213007 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerStarted","Data":"229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d"}
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.213063 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerStarted","Data":"cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda"}
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.215825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerStarted","Data":"92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab"}
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.215893 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerStarted","Data":"c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663"}
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.215911 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerStarted","Data":"1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c"}
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.250656 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.250625273 podStartE2EDuration="2.250625273s" podCreationTimestamp="2026-02-20 07:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:43.238502412 +0000 UTC m=+1458.111129153" watchObservedRunningTime="2026-02-20 07:10:43.250625273 +0000 UTC m=+1458.123251994"
Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.277405 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.277381203 podStartE2EDuration="2.277381203s" podCreationTimestamp="2026-02-20 07:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:43.27146057 +0000 UTC m=+1458.144087301" watchObservedRunningTime="2026-02-20 07:10:43.277381203 +0000 UTC m=+1458.150007914"
Feb 20 07:10:45 crc kubenswrapper[5094]: I0220 07:10:45.541373 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 20 07:10:46 crc kubenswrapper[5094]: I0220 07:10:46.639733 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 07:10:46 crc kubenswrapper[5094]: I0220 07:10:46.639807 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 07:10:50 crc kubenswrapper[5094]: I0220 07:10:50.542267 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 20 07:10:50 crc kubenswrapper[5094]: I0220 07:10:50.594624 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.388603 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.639894 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.639989 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.693136 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.693371 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.688061 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.688100 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.707896 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.707891 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:11:00 crc kubenswrapper[5094]: I0220 07:11:00.442681 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.648049 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.648895 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.655643 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.708281 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.710027 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.710179 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.716664 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 07:11:02 crc kubenswrapper[5094]: I0220 07:11:02.479888 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 07:11:02 crc kubenswrapper[5094]: I0220 07:11:02.490824 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 07:11:02 crc kubenswrapper[5094]: I0220 07:11:02.509832 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.106794 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.107311 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.107370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.108062 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.108132 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" gracePeriod=600
Feb 20 07:11:04 crc kubenswrapper[5094]: E0220 07:11:04.238793 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.509821 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" exitCode=0
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.509955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"}
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.510390 5094 scope.go:117] "RemoveContainer" containerID="b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7"
Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.512358 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:11:04 crc kubenswrapper[5094]: E0220 07:11:04.512979 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:11:17 crc kubenswrapper[5094]: I0220 07:11:17.841090 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:11:17 crc kubenswrapper[5094]: E0220 07:11:17.842441 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.441806 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.443077 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient" containerID="cri-o://1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe" gracePeriod=2
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.504491 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.705950 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.747079 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"]
Feb 20 07:11:22 crc kubenswrapper[5094]: E0220 07:11:22.747744 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.747766 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.748003 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.748820 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.753208 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.798811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.799300 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:22 crc kubenswrapper[5094]: E0220 07:11:22.799996 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 20 07:11:22 crc kubenswrapper[5094]: E0220 07:11:22.800098 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:23.3000593 +0000 UTC m=+1498.172686011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.830600 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"]
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.875819 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"]
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.901196 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.901545 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.902613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.925660 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.926305 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd" containerID="cri-o://3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" gracePeriod=30
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.926970 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter" containerID="cri-o://aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb" gracePeriod=30
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.953894 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"]
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.963250 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.997112 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.031787 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.057569 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.063058 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.072819 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.088160 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.102943 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.110953 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.112745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.112909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.137096 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b77jp"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.175845 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b77jp"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.206366 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.222725 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.222838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.224098 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.262173 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jsvf2"]
Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328101 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328188 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:23.828156618 +0000 UTC m=+1498.700783329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found
Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328289 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328409 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:24.328373984 +0000 UTC m=+1499.201000685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.334792 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jsvf2"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.355626 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.406829 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5vc6z"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.408310 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.417151 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.487734 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5vc6z"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.534791 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.535200 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-gg2h9" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter" containerID="cri-o://ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310" gracePeriod=30
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.536325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.536522 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.550821 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lvlr2"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.573162 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.593027 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.608355 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.623476 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.640061 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.640164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.641377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.656409 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.689723 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.689846 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-d27ft"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.737883 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.759426 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-d27ft"]
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.771743 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.860028 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"]
Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.866053 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.866130 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:24.866106042 +0000 UTC m=+1499.738732753 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found
Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.912383 5094 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/swift-proxy-6964856c75-f7xdp" secret="" err="secret \"swift-swift-dockercfg-5btkp\" not found"
Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.009489 5094 generic.go:334] "Generic (PLEG): container finished" podID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerID="aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb" exitCode=2
Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.105860 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" path="/var/lib/kubelet/pods/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b/volumes"
Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.106549 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" path="/var/lib/kubelet/pods/23a44809-2f91-4dbe-80ed-733390b037d8/volumes"
Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.107143 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" path="/var/lib/kubelet/pods/2538e2cc-781b-4c2a-b993-381e488fd5bb/volumes"
Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.107998 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" path="/var/lib/kubelet/pods/5044f3da-a9aa-4f6e-b598-3b5e963f8731/volumes"
Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.112522 5094 kubelet_volumes.go:163]
"Cleaned up orphaned pod volumes dir" podUID="50bf9176-b504-436f-a845-7ab55506a258" path="/var/lib/kubelet/pods/50bf9176-b504-436f-a845-7ab55506a258/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.125998 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772e2155-8d29-40de-8aff-5e42112e6171" path="/var/lib/kubelet/pods/772e2155-8d29-40de-8aff-5e42112e6171/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.127064 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4920eee-8485-4faa-892c-893c6466a90c" path="/var/lib/kubelet/pods/c4920eee-8485-4faa-892c-893c6466a90c/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.128477 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" path="/var/lib/kubelet/pods/fd4e0644-4339-45bf-a919-0de0551c5baa/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.128946 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gg2h9_ca32118b-2e77-4484-b753-3467e1ba8df1/openstack-network-exporter/0.log" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129000 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerID="ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310" exitCode=2 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129243 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerDied","Data":"aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb"} Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129287 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129308 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerDied","Data":"ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310"} Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.131634 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.163122 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.202531 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208625 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208662 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208682 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208752 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. No retries permitted until 2026-02-20 07:11:24.708728647 +0000 UTC m=+1499.581355358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.270124 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:24 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: if [ -n "nova_api" ]; then Feb 20 07:11:24 crc kubenswrapper[5094]: GRANT_DATABASE="nova_api" Feb 20 07:11:24 crc kubenswrapper[5094]: else Feb 20 07:11:24 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:24 crc kubenswrapper[5094]: fi Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:24 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:24 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:24 crc kubenswrapper[5094]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:24 crc kubenswrapper[5094]: # support updates Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.271763 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-d63e-account-create-update-tvf55" podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.286296 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.330053 5094 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-lvlr2" message="Exiting ovn-controller (1) " Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.330098 5094 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" containerID="cri-o://4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.330140 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" containerID="cri-o://4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 
07:11:24.351323 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.425620 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.426000 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b8f9d577d-pgn2k" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log" containerID="cri-o://d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.426329 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b8f9d577d-pgn2k" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api" containerID="cri-o://91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.431978 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.432133 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:26.432099557 +0000 UTC m=+1501.304726278 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.468721 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.480517 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.548779 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.583909 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.632522 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.633355 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter" containerID="cri-o://2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.664840 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca32118b_2e77_4484_b753_3467e1ba8df1.slice/crio-ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca32118b_2e77_4484_b753_3467e1ba8df1.slice/crio-conmon-ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecb5d91_5ba1_457e_af42_0d78c8643250.slice/crio-conmon-4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecb5d91_5ba1_457e_af42_0d78c8643250.slice/crio-4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301.scope\": RecentStats: unable to find data in memory cache]" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.680947 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.682082 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter" containerID="cri-o://d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.734960 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.734998 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.735011 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.735072 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. No retries permitted until 2026-02-20 07:11:25.735050252 +0000 UTC m=+1500.607676953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.740037 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.759901 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="ovsdbserver-sb" containerID="cri-o://333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.778876 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.785549 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.785946 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7677694455-llk7m" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns" containerID="cri-o://fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff" gracePeriod=10 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.792982 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793431 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server" containerID="cri-o://77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793724 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb" containerID="cri-o://87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793812 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron" containerID="cri-o://694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793864 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync" containerID="cri-o://d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793902 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer" containerID="cri-o://ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793960 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor" 
containerID="cri-o://da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794020 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater" containerID="cri-o://997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794074 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor" containerID="cri-o://c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794135 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator" containerID="cri-o://b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794192 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server" containerID="cri-o://6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794230 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater" containerID="cri-o://ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794277 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper" containerID="cri-o://876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794315 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator" containerID="cri-o://798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794364 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server" containerID="cri-o://13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794406 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator" containerID="cri-o://4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794452 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor" containerID="cri-o://87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.800329 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.804095 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 
07:11:24.816864 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.817193 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log" containerID="cri-o://0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.817922 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd" containerID="cri-o://5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.820641 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.821964 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler" containerID="cri-o://2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.822170 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe" containerID="cri-o://0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.868769 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.881770 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.882056 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54bd68f77-fkqmr" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api" containerID="cri-o://2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.882458 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54bd68f77-fkqmr" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd" containerID="cri-o://5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.898227 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.904497 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log" containerID="cri-o://fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.904931 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd" containerID="cri-o://d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.937129 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.938953 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 
07:11:24.942161 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.942215 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:26.942198393 +0000 UTC m=+1501.814825104 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.942450 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" containerID="cri-o://229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.942767 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" containerID="cri-o://50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.945874 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" containerID="cri-o://381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" gracePeriod=29 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.966318 5094 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/rabbitmq-server-0" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq" containerID="cri-o://74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6" gracePeriod=604800 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.015841 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.299934 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gg2h9_ca32118b-2e77-4484-b753-3467e1ba8df1/openstack-network-exporter/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.300438 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.302077 5094 generic.go:334] "Generic (PLEG): container finished" podID="762a565c-672e-4127-a8c6-90f721eeda81" containerID="fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.302160 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerDied","Data":"fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.319291 5094 generic.go:334] "Generic (PLEG): container finished" podID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerID="d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.319486 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.319536 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" 
event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerDied","Data":"d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.320023 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7df9984bd9-6txsf" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log" containerID="cri-o://7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.320166 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7df9984bd9-6txsf" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker" containerID="cri-o://d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.332640 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337712 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d3ec8857-5a33-44ea-bdd0-97b343adfc8a/ovsdbserver-nb/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337756 5094 generic.go:334] "Generic (PLEG): container finished" podID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerID="d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976" exitCode=2 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337776 5094 generic.go:334] "Generic (PLEG): container finished" podID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerID="87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337855 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerDied","Data":"d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337884 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerDied","Data":"87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.338853 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.340025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-tvf55" event={"ID":"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7","Type":"ContainerStarted","Data":"9e2e1470fd33e88144567dad9b332e30ed6e81a2d129cafee24b4fca5bfc7939"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.365988 5094 generic.go:334] "Generic (PLEG): container finished" podID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerID="229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.366092 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerDied","Data":"229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d"} Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.366434 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:25 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: if [ -n "nova_api" ]; then Feb 20 07:11:25 crc kubenswrapper[5094]: GRANT_DATABASE="nova_api" Feb 20 07:11:25 crc kubenswrapper[5094]: else Feb 20 
07:11:25 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:25 crc kubenswrapper[5094]: fi Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:25 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:25 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:25 crc kubenswrapper[5094]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:25 crc kubenswrapper[5094]: # support updates Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.367695 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-d63e-account-create-update-tvf55" podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.405338 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerID="4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.405459 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.405498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2" event={"ID":"8ecb5d91-5ba1-457e-af42-0d78c8643250","Type":"ContainerDied","Data":"4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.406692 5094 scope.go:117] "RemoveContainer" containerID="4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.433763 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.434044 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log" containerID="cri-o://e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.434141 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api" containerID="cri-o://d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.438978 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjn6q\" (UniqueName: 
\"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439099 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439122 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439169 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439224 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5tkf\" (UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439287 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439326 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439387 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439448 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439494 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439594 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod 
\"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.447902 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run" (OuterVolumeSpecName: "var-run") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.451799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config" (OuterVolumeSpecName: "config") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.455390 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts" (OuterVolumeSpecName: "scripts") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457340 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cadd011d-8dde-4346-8608-c5f74376204d/ovsdbserver-sb/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457387 5094 generic.go:334] "Generic (PLEG): container finished" podID="cadd011d-8dde-4346-8608-c5f74376204d" containerID="2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47" exitCode=2 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457407 5094 generic.go:334] "Generic (PLEG): container finished" podID="cadd011d-8dde-4346-8608-c5f74376204d" containerID="333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457499 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerDied","Data":"2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457531 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerDied","Data":"333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458501 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458542 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458686 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465411 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465543 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465569 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465647 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466681 5094 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466717 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") on 
node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466727 5094 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466737 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466748 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466756 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466765 5094 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.467843 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.486909 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-54bd68f77-fkqmr" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.165:9696/\": read tcp 10.217.0.2:45196->10.217.0.165:9696: read: connection reset by peer" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.488131 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerID="1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe" exitCode=137 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.488387 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.511157 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.519092 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q" (OuterVolumeSpecName: "kube-api-access-hjn6q") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "kube-api-access-hjn6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.519248 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf" (OuterVolumeSpecName: "kube-api-access-s5tkf") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "kube-api-access-s5tkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.547546 5094 scope.go:117] "RemoveContainer" containerID="1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.548263 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerID="0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.548342 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerDied","Data":"0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.577688 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5tkf\" (UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.577733 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.585143 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.587866 5094 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 20 07:11:25 crc kubenswrapper[5094]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 07:11:25 crc kubenswrapper[5094]: + source /usr/local/bin/container-scripts/functions Feb 20 07:11:25 crc kubenswrapper[5094]: ++ 
OVNBridge=br-int Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNRemote=tcp:localhost:6642 Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNEncapType=geneve Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNAvailabilityZones= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ EnableChassisAsGateway=true Feb 20 07:11:25 crc kubenswrapper[5094]: ++ PhysicalNetworks= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNHostName= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 07:11:25 crc kubenswrapper[5094]: ++ ovs_dir=/var/lib/openvswitch Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 07:11:25 crc kubenswrapper[5094]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + cleanup_ovsdb_server_semaphore Feb 20 07:11:25 crc kubenswrapper[5094]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 07:11:25 crc kubenswrapper[5094]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-tj42x" message=< Feb 20 07:11:25 crc kubenswrapper[5094]: Exiting ovsdb-server (5) [ OK ] Feb 20 07:11:25 crc kubenswrapper[5094]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 07:11:25 crc kubenswrapper[5094]: + source /usr/local/bin/container-scripts/functions Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNBridge=br-int Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNRemote=tcp:localhost:6642 Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNEncapType=geneve Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNAvailabilityZones= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ EnableChassisAsGateway=true Feb 20 07:11:25 crc kubenswrapper[5094]: ++ PhysicalNetworks= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNHostName= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 07:11:25 crc kubenswrapper[5094]: ++ ovs_dir=/var/lib/openvswitch Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 07:11:25 crc kubenswrapper[5094]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + cleanup_ovsdb_server_semaphore Feb 20 07:11:25 crc kubenswrapper[5094]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 07:11:25 crc kubenswrapper[5094]: > Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.587916 5094 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 20 07:11:25 crc kubenswrapper[5094]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 07:11:25 crc kubenswrapper[5094]: + source /usr/local/bin/container-scripts/functions Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNBridge=br-int Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNRemote=tcp:localhost:6642 Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNEncapType=geneve Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNAvailabilityZones= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ EnableChassisAsGateway=true Feb 20 07:11:25 crc kubenswrapper[5094]: ++ PhysicalNetworks= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNHostName= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 07:11:25 crc kubenswrapper[5094]: ++ ovs_dir=/var/lib/openvswitch Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 07:11:25 crc kubenswrapper[5094]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + cleanup_ovsdb_server_semaphore Feb 20 07:11:25 crc kubenswrapper[5094]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 07:11:25 crc kubenswrapper[5094]: > pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" containerID="cri-o://ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.587957 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" containerID="cri-o://ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" gracePeriod=28 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.599571 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.599953 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" 
containerID="cri-o://c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.600128 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api" containerID="cri-o://92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.606750 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.612057 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.612378 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log" containerID="cri-o://459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.612557 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api" containerID="cri-o://a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.617468 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624225 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624361 5094 generic.go:334] 
"Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624420 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624495 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624548 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624610 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624666 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624741 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624809 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" exitCode=0 Feb 20 07:11:25 crc 
kubenswrapper[5094]: I0220 07:11:25.624894 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625135 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625228 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625290 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625812 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625880 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.626002 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.626074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.627983 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.628242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f" (OuterVolumeSpecName: "kube-api-access-n6d2f") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "kube-api-access-n6d2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.628276 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log" containerID="cri-o://c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.628445 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener" containerID="cri-o://b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.666560 5094 generic.go:334] "Generic (PLEG): container finished" podID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerID="fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.666714 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerDied","Data":"fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.670488 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.670793 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler" containerID="cri-o://e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679100 5094 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gg2h9_ca32118b-2e77-4484-b753-3467e1ba8df1/openstack-network-exporter/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679188 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerDied","Data":"8220b8fae0a21f561557c1539a6ab409db96f4d4c24a493c8737608b37dc4bc1"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679300 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679890 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.695841 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.724648 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.746228 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.761324 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.782592 5094 scope.go:117] "RemoveContainer" containerID="ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.785494 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793636 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793688 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793715 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793806 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. 
No retries permitted until 2026-02-20 07:11:27.793777547 +0000 UTC m=+1502.666404258 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.799268 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.802856 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.810895 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.832692 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.833671 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.847414 5094 info.go:109] Failed to get network devices: open /sys/class/net/11d132d9afa2713/address: no such 
file or directory Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.883432 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" path="/var/lib/kubelet/pods/15583b83-ce22-4b0b-9566-0e056b07c0d7/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.884498 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" path="/var/lib/kubelet/pods/317d32d8-9ad2-4bd1-87f4-745e3157c713/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.885081 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" path="/var/lib/kubelet/pods/3d59abb8-e7c7-404f-8f03-13d2167bea54/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.904263 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" path="/var/lib/kubelet/pods/61f66271-5ce9-4412-8ea3-9a63a934f307/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.908811 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" path="/var/lib/kubelet/pods/7fb81f20-1f88-4c11-a37a-31db4472afd2/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.909315 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" path="/var/lib/kubelet/pods/85a1c623-233b-4b7e-9a57-e761a5ad27ab/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.909860 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" path="/var/lib/kubelet/pods/a87399a2-42e4-4f46-b93c-cd4f25594a16/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.917250 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" 
path="/var/lib/kubelet/pods/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.917803 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" path="/var/lib/kubelet/pods/d6e6aec3-87a9-4f8a-b640-313ab241ec6f/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.918443 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" path="/var/lib/kubelet/pods/ffc4926a-ede6-4124-ac91-c9912ffa8a23/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.925814 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" path="/var/lib/kubelet/pods/ffd170be-0f58-4016-a451-5fb1f7fd9f1b/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.936237 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.969270 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d3ec8857-5a33-44ea-bdd0-97b343adfc8a/ovsdbserver-nb/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.969353 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990556 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990599 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990615 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990628 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990914 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.997866 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.010788 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.013622 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.017788 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:26 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:26 crc 
kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: if [ -n "glance" ]; then Feb 20 07:11:26 crc kubenswrapper[5094]: GRANT_DATABASE="glance" Feb 20 07:11:26 crc kubenswrapper[5094]: else Feb 20 07:11:26 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:26 crc kubenswrapper[5094]: fi Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:26 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:26 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:26 crc kubenswrapper[5094]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:26 crc kubenswrapper[5094]: # support updates Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.019047 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-1fa0-account-create-update-zvvj2" podUID="290e5022-8d17-4415-87c0-07891b0b66f5" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.022784 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.061455 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.081237 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.081328 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.083305 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.108118 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.131652 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.144029 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145200 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145222 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145248 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 
07:11:26.145279 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145342 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145367 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.146544 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.149384 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" 
(UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.157437 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.160981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config" (OuterVolumeSpecName: "config") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.166562 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9" (OuterVolumeSpecName: "kube-api-access-55pw9") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "kube-api-access-55pw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.169991 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts" (OuterVolumeSpecName: "scripts") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.172533 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.185130 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.222202 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.224542 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq" containerID="cri-o://bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" gracePeriod=604800 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.247956 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252137 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252185 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252198 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252212 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252226 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 
07:11:26.252255 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252269 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.295040 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.305196 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.307622 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.308084 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" gracePeriod=30 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.310841 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.315913 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.347262 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.356054 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.356133 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.356810 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.359132 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.359159 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.378270 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera" containerID="cri-o://f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" gracePeriod=30 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.414160 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.415664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). 
InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.417075 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.421735 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.426436 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.433974 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.439873 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.446586 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.447121 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" gracePeriod=30 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.465431 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.465469 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.465543 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.465596 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:30.465578077 +0000 UTC m=+1505.338204788 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.518148 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cadd011d-8dde-4346-8608-c5f74376204d/ovsdbserver-sb/0.log" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.518237 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.574749 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.574828 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.574902 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.575127 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod 
\"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.575185 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.575235 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.609021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42" (OuterVolumeSpecName: "kube-api-access-djp42") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "kube-api-access-djp42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.610191 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.629945 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.656253 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.662533 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.672891 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678413 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678521 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 
07:11:26.678646 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678793 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678830 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678851 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678938 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.679361 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") on node \"crc\" DevicePath \"\"" 
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.687743 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.688012 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd" containerID="cri-o://e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" gracePeriod=30 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.688556 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server" containerID="cri-o://9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" gracePeriod=30 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.691617 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.692815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts" (OuterVolumeSpecName: "scripts") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.693245 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.693568 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config" (OuterVolumeSpecName: "config") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.697935 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.712352 5094 generic.go:334] "Generic (PLEG): container finished" podID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerID="7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a" exitCode=143 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.712419 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerDied","Data":"7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.718084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz" (OuterVolumeSpecName: "kube-api-access-s4rzz") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "kube-api-access-s4rzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728276 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" exitCode=0 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728326 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" exitCode=0 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728336 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" exitCode=0 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728347 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" exitCode=0 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53"} Feb 20 07:11:26 crc 
kubenswrapper[5094]: I0220 07:11:26.728660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735068 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d3ec8857-5a33-44ea-bdd0-97b343adfc8a/ovsdbserver-nb/0.log" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735196 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerDied","Data":"c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735267 5094 scope.go:117] "RemoveContainer" containerID="d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735309 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.760650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerDied","Data":"11d132d9afa27137f856e4e3ac63fa1a46eebfeb7ef403dea2957ddcdaf2acba"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.760856 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.768721 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config" (OuterVolumeSpecName: "config") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783921 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783962 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783975 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783991 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.784004 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.784017 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.795653 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cadd011d-8dde-4346-8608-c5f74376204d/ovsdbserver-sb/0.log" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.795751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerDied","Data":"066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.795837 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.806944 5094 generic.go:334] "Generic (PLEG): container finished" podID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerID="459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57" exitCode=143 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.807061 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerDied","Data":"459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.821407 5094 generic.go:334] "Generic (PLEG): container finished" podID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerID="c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d" exitCode=143 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.821564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerDied","Data":"c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.827408 5094 scope.go:117] "RemoveContainer" containerID="87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.828471 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-zvvj2" 
event={"ID":"290e5022-8d17-4415-87c0-07891b0b66f5","Type":"ContainerStarted","Data":"53f6cb6ae1d8a22199f485c77e321317380823b606f56ad3e800d4265b641405"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.834753 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.850075 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.866095 5094 generic.go:334] "Generic (PLEG): container finished" podID="530069d2-7146-46eb-9c88-056cc8a583b2" containerID="5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3" exitCode=0 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.866169 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerDied","Data":"5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.872066 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerStarted","Data":"09536db7b188d6d7e190e80f4a66c3a18be0cf58fe509f674089f0d4cd9626eb"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.873364 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.873746 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server" probeResult="failure" output="Get 
\"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.875575 5094 generic.go:334] "Generic (PLEG): container finished" podID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" exitCode=0 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.875653 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.889884 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerID="c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663" exitCode=143 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.889963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerDied","Data":"c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663"} Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.898364 5094 scope.go:117] "RemoveContainer" containerID="fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.899212 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.906395 5094 generic.go:334] "Generic (PLEG): container finished" podID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerID="0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23" exitCode=0 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.906507 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerDied","Data":"0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23"} Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.969770 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.971290 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.971570 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:26 crc 
kubenswrapper[5094]: I0220 07:11:26.980159 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerID="e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32" exitCode=143 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.980247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerDied","Data":"e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32"} Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.980302 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.980349 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.980911 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.989861 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.989926 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.001097 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.004398 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.010308 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.010342 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.010351 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.010371 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.010404 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.010381914 +0000 UTC m=+1505.883008625 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.018008 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.039075 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.054302 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:27 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: if [ -n "nova_api" ]; then Feb 20 07:11:27 crc kubenswrapper[5094]: GRANT_DATABASE="nova_api" Feb 20 07:11:27 crc kubenswrapper[5094]: else Feb 20 07:11:27 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:27 crc kubenswrapper[5094]: fi Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:27 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:27 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:27 crc kubenswrapper[5094]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:27 crc kubenswrapper[5094]: # support updates Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.055996 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-d63e-account-create-update-tvf55" podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.087916 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.088051 5094 scope.go:117] "RemoveContainer" containerID="a4c4f92b36bb2b7d701dbf8c3f7817a427ce69bddf6bb34e82a6884705e2608c" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.097334 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.100081 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.114626 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.114720 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.115014 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.115024 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.115039 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.220384 5094 scope.go:117] 
"RemoveContainer" containerID="2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.295930 5094 scope.go:117] "RemoveContainer" containerID="333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.401957 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.409610 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.521797 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.528821 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.635006 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.730818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"290e5022-8d17-4415-87c0-07891b0b66f5\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.731030 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"290e5022-8d17-4415-87c0-07891b0b66f5\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.733851 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "290e5022-8d17-4415-87c0-07891b0b66f5" (UID: "290e5022-8d17-4415-87c0-07891b0b66f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.743472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt" (OuterVolumeSpecName: "kube-api-access-497kt") pod "290e5022-8d17-4415-87c0-07891b0b66f5" (UID: "290e5022-8d17-4415-87c0-07891b0b66f5"). InnerVolumeSpecName "kube-api-access-497kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.834312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.834361 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834476 5094 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834493 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834505 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834519 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834582 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.834559733 +0000 UTC m=+1506.707186434 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.879438 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" path="/var/lib/kubelet/pods/20ff73f2-0b55-4d81-9342-92dbe47435f0/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.880344 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" path="/var/lib/kubelet/pods/32b230f5-7de4-450c-90e6-e9c18a0d9c0e/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.881123 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" path="/var/lib/kubelet/pods/37533bd5-22b5-4b59-8672-35eaa19b9295/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.882323 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" path="/var/lib/kubelet/pods/43cfca6d-55e3-431f-b5b8-2b8db44bcee0/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.882879 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" path="/var/lib/kubelet/pods/462ace9b-51c7-4cd0-850a-65d714c5f3b6/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.883579 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" path="/var/lib/kubelet/pods/5c0f5daa-28f1-412d-8749-5b11f6b8f26d/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.884300 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" path="/var/lib/kubelet/pods/74c08d54-fdef-4808-bf52-f8ea0894af36/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.885589 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" path="/var/lib/kubelet/pods/75a27624-eac7-47c9-9f3b-98604d88fb3a/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.893795 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.904412 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.905749 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" path="/var/lib/kubelet/pods/8ecb5d91-5ba1-457e-af42-0d78c8643250/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.907548 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" path="/var/lib/kubelet/pods/9b534507-5d2d-496b-9a60-f0b45e25bb23/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.908849 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" path="/var/lib/kubelet/pods/ca32118b-2e77-4484-b753-3467e1ba8df1/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.910269 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cadd011d-8dde-4346-8608-c5f74376204d" path="/var/lib/kubelet/pods/cadd011d-8dde-4346-8608-c5f74376204d/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.911387 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" path="/var/lib/kubelet/pods/d3ec8857-5a33-44ea-bdd0-97b343adfc8a/volumes" Feb 
20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.919932 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" path="/var/lib/kubelet/pods/f8ca33ba-f76e-4352-b6f1-54588dd25285/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.927769 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.944842 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.944914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.944999 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945031 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945087 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945153 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945223 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945269 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945321 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945386 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod 
\"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945469 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945678 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945726 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945757 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945791 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945849 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945892 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc 
kubenswrapper[5094]: I0220 07:11:27.952503 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.954440 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.966863 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2" (OuterVolumeSpecName: "kube-api-access-44cs2") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "kube-api-access-44cs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.968767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.969023 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.971868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.980026 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b" (OuterVolumeSpecName: "kube-api-access-lk69b") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "kube-api-access-lk69b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.994942 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.019271 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r" (OuterVolumeSpecName: "kube-api-access-fss5r") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "kube-api-access-fss5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.052680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.066619 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data" (OuterVolumeSpecName: "config-data") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067283 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067335 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067346 5094 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067357 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067367 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067382 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067391 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc 
kubenswrapper[5094]: I0220 07:11:28.068585 5094 generic.go:334] "Generic (PLEG): container finished" podID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.068759 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerDied","Data":"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.068831 5094 scope.go:117] "RemoveContainer" containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.069074 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.069221 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerDied","Data":"d1d010e8fd8bc9707a8121e22dcd018ce86f613e9bbcc45a6bd9ab2c3e354582"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.070837 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.071087 5094 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.071119 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") on node \"crc\" 
DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.077624 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.115727 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.178:9292/healthcheck\": read tcp 10.217.0.2:52950->10.217.0.178:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.115758 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.178:9292/healthcheck\": read tcp 10.217.0.2:52948->10.217.0.178:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.121529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.123979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.151977 5094 generic.go:334] "Generic (PLEG): container finished" podID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.152102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerDied","Data":"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.152139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerDied","Data":"ab6b9e58f533ca8387a46ffe5e0cb304794c4450b59c803c80417c57e86e76ef"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.152233 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.165020 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176926 5094 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176954 5094 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176963 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176974 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.177000 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.181967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-zvvj2" event={"ID":"290e5022-8d17-4415-87c0-07891b0b66f5","Type":"ContainerDied","Data":"53f6cb6ae1d8a22199f485c77e321317380823b606f56ad3e800d4265b641405"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.182121 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197415 5094 generic.go:334] "Generic (PLEG): container finished" podID="8e30bbcd-c206-4a74-ae52-21462356babf" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197463 5094 generic.go:334] "Generic (PLEG): container finished" podID="8e30bbcd-c206-4a74-ae52-21462356babf" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerDied","Data":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197593 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerDied","Data":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerDied","Data":"d9175bae901767e94daa13c4878599492cbc5a434fa253663ae2586b3df20eb3"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197687 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.256252 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data" (OuterVolumeSpecName: "config-data") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.256851 5094 generic.go:334] "Generic (PLEG): container finished" podID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerID="91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.256948 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerDied","Data":"91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.274216 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.281070 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.281105 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.286135 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.319913 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.339542 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.354891 5094 generic.go:334] "Generic (PLEG): container finished" podID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" exitCode=1 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.355002 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerDied","Data":"8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.355777 5094 scope.go:117] "RemoveContainer" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.386011 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398871 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398910 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398923 5094 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398933 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.484811 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.505675 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.629371 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": read tcp 10.217.0.2:47654->10.217.0.177:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.629807 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": read tcp 10.217.0.2:47664->10.217.0.177:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.634738 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:34896->10.217.0.204:8775: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.634918 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:34910->10.217.0.204:8775: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.832412 5094 scope.go:117] "RemoveContainer" 
containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" Feb 20 07:11:28 crc kubenswrapper[5094]: E0220 07:11:28.832904 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf\": container with ID starting with 2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf not found: ID does not exist" containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.832936 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf"} err="failed to get container status \"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf\": rpc error: code = NotFound desc = could not find container \"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf\": container with ID starting with 2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf not found: ID does not exist" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.832974 5094 scope.go:117] "RemoveContainer" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.848917 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:52130->10.217.0.159:9311: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.849226 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api" probeResult="failure" output="Get 
\"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:52138->10.217.0.159:9311: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.873250 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.901069 5094 scope.go:117] "RemoveContainer" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924363 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924681 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924890 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924955 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924980 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924997 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.927006 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs" (OuterVolumeSpecName: "logs") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.944662 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts" (OuterVolumeSpecName: "scripts") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.944879 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6" (OuterVolumeSpecName: "kube-api-access-qxpk6") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "kube-api-access-qxpk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.033251 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.033751 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.033765 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.042194 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.044264 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.051390 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.105244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data" (OuterVolumeSpecName: "config-data") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.122202 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.144120 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.144150 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.174004 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.210882 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.211212 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent" containerID="cri-o://f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.214231 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd" containerID="cri-o://4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.214321 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core" containerID="cri-o://0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.214377 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent" containerID="cri-o://b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.233406 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.241296 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.246259 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.265901 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.296826 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.297191 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics" containerID="cri-o://0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.325186 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.344211 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": dial tcp 10.217.0.163:8776: connect: connection refused" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.362238 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.387179 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: 
"b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.390119 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.390469 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached" containerID="cri-o://631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.424613 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.455553 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.466373 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473147 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"] Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473634 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473648 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473664 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadd011d-8dde-4346-8608-c5f74376204d" 
containerName="ovsdbserver-sb"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473670 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="ovsdbserver-sb"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473683 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473689 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473913 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="init"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473923 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="init"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473968 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473999 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474012 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474020 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474028 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474034 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474046 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474052 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474068 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474074 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474085 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474092 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474101 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474107 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474114 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474122 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474135 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474141 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474152 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="mysql-bootstrap"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474166 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="mysql-bootstrap"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474334 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474348 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474360 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474370 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474393 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474402 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474412 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474420 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474429 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474441 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474452 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474462 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="ovsdbserver-sb"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.475140 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.477581 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.492371 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.500921 5094 generic.go:334] "Generic (PLEG): container finished" podID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerID="50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.501057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerDied","Data":"50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.515786 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.529082 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.544463 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerID="5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.545000 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-plbtm"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.545037 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerDied","Data":"5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.547759 5094 scope.go:117] "RemoveContainer" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.557820 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14\": container with ID starting with f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14 not found: ID does not exist" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.557874 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"} err="failed to get container status \"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14\": rpc error: code = NotFound desc = could not find container \"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14\": container with ID starting with f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14 not found: ID does not exist"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.557908 5094 scope.go:117] "RemoveContainer" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.559278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.559351 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.562502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-plbtm"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.569462 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.572925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8\": container with ID starting with 1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8 not found: ID does not exist" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.572977 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"} err="failed to get container status \"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8\": rpc error: code = NotFound desc = could not find container \"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8\": container with ID starting with 1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8 not found: ID does not exist"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.573013 5094 scope.go:117] "RemoveContainer" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.584533 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.584813 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-55468cd684-wv6dn" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api" containerID="cri-o://d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051" gracePeriod=30
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.608277 5094 generic.go:334] "Generic (PLEG): container finished" podID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerID="b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.608371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerDied","Data":"b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08"}
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.608723 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.615319 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.619481 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.619517 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.619508 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-kbc28"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.627176 5094 generic.go:334] "Generic (PLEG): container finished" podID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerID="d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.627283 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerDied","Data":"d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.633556 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" exitCode=2
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.633632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.636886 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerID="92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.636953 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerDied","Data":"92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.655517 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.661486 5094 generic.go:334] "Generic (PLEG): container finished" podID="762a565c-672e-4127-a8c6-90f721eeda81" containerID="d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.661575 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerDied","Data":"d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.661765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.667776 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.667855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.668211 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.668273 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:30.168252959 +0000 UTC m=+1505.040879670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : configmap "openstack-scripts" not found
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.669323 5094 generic.go:334] "Generic (PLEG): container finished" podID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerID="a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.672971 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerDied","Data":"a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.679241 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-kbc28"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.676227 5094 scope.go:117] "RemoveContainer" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.683012 5094 projected.go:194] Error preparing data for projected volume kube-api-access-tvlxh for pod openstack/keystone-6cc2-account-create-update-98lt6: failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.683120 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:30.183088784 +0000 UTC m=+1505.055715495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tvlxh" (UniqueName: "kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.706559 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.706645 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerDied","Data":"b019c482612055cc0918048f8a12a69d96710169b8244b6ca81050099107cecc"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.727400 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"]
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.735120 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-tvlxh operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-6cc2-account-create-update-98lt6" podUID="c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.765098 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5vc6z"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.769420 5094 scope.go:117] "RemoveContainer" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770388 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770453 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770538 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770569 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770683 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770721 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.779222 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.779348 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": container with ID starting with 9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c not found: ID does not exist" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.779379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} err="failed to get container status \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": rpc error: code = NotFound desc = could not find container \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": container with ID starting with 9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c not found: ID does not exist"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.779409 5094 scope.go:117] "RemoveContainer" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.781438 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" (UID: "2a1b6a8a-aefe-4f59-8936-f08aed30d8f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.781542 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": container with ID starting with e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96 not found: ID does not exist" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.781570 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} err="failed to get container status \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": rpc error: code = NotFound desc = could not find container \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": container with ID starting with e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96 not found: ID does not exist"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.781587 5094 scope.go:117] "RemoveContainer" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.782556 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs" (OuterVolumeSpecName: "logs") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793483 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd" (OuterVolumeSpecName: "kube-api-access-r7nxd") pod "2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" (UID: "2a1b6a8a-aefe-4f59-8936-f08aed30d8f7"). InnerVolumeSpecName "kube-api-access-r7nxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793605 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} err="failed to get container status \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": rpc error: code = NotFound desc = could not find container \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": container with ID starting with 9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c not found: ID does not exist"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793675 5094 scope.go:117] "RemoveContainer" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793915 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerID="d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15" exitCode=0
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerDied","Data":"d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15"}
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.806153 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.806278 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} err="failed to get container status \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": rpc error: code = NotFound desc = could not find container \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": container with ID starting with e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96 not found: ID does not exist"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.806309 5094 scope.go:117] "RemoveContainer" containerID="91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.813897 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.818551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx" (OuterVolumeSpecName: "kube-api-access-gq8dx") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "kube-api-access-gq8dx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.820644 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"]
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.850751 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data" (OuterVolumeSpecName: "config-data") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874568 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874616 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874650 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874754 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874791 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874824 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874867 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874950 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") "
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.876368 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs" (OuterVolumeSpecName: "logs") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.877402 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" path="/var/lib/kubelet/pods/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b/volumes"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.878118 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290e5022-8d17-4415-87c0-07891b0b66f5" path="/var/lib/kubelet/pods/290e5022-8d17-4415-87c0-07891b0b66f5/volumes"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.878529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs" (OuterVolumeSpecName: "logs") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.878603 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" path="/var/lib/kubelet/pods/30e79ba9-83fc-4246-9fb2-7136f6ae30a5/volumes"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.879806 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" path="/var/lib/kubelet/pods/3db6d35c-dfd1-4a59-95d3-cc8a99151c12/volumes"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.880347 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" path="/var/lib/kubelet/pods/403a4371-09f4-4206-8d60-5b970d7e4faf/volumes"
Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.880912 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" path="/var/lib/kubelet/pods/876bc507-6cf2-466a-9cd3-6131a1cc590e/volumes"
Feb 20 07:11:29 crc 
kubenswrapper[5094]: I0220 07:11:29.881430 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" path="/var/lib/kubelet/pods/8e30bbcd-c206-4a74-ae52-21462356babf/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882187 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882214 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882852 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882869 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882880 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882893 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882902 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882910 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882919 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.884448 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" path="/var/lib/kubelet/pods/a0e18d8b-2657-4e87-b6ca-009df89bbac8/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.885014 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" path="/var/lib/kubelet/pods/b90c9110-e4fd-461b-ad2c-a58ff01921d8/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 
07:11:29.887062 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.898578 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m" (OuterVolumeSpecName: "kube-api-access-9bf7m") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "kube-api-access-9bf7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.898788 5094 scope.go:117] "RemoveContainer" containerID="d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.904204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.918152 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56" (OuterVolumeSpecName: "kube-api-access-z8j56") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "kube-api-access-z8j56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.942541 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.946791 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts" (OuterVolumeSpecName: "scripts") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.984954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.985946 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.985978 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986014 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986024 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986037 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986047 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986057 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.014936 
5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.091752 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.091896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.091968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092006 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092026 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092095 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.093495 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs" (OuterVolumeSpecName: "logs") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.095712 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data" (OuterVolumeSpecName: "config-data") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.097918 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.099322 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d" (OuterVolumeSpecName: "kube-api-access-4ns5d") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "kube-api-access-4ns5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.143665 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.164121 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200166 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200748 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200833 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200845 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200854 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200864 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 
07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200875 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200884 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.200301 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.201337 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.201302604 +0000 UTC m=+1506.073929315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : configmap "openstack-scripts" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.205171 5094 projected.go:194] Error preparing data for projected volume kube-api-access-tvlxh for pod openstack/keystone-6cc2-account-create-update-98lt6: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.205576 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. 
No retries permitted until 2026-02-20 07:11:31.205217448 +0000 UTC m=+1506.077844149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tvlxh" (UniqueName: "kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.206009 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.214993 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.229570 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.248110 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.248209 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.248680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.260869 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera" containerID="cri-o://b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" gracePeriod=30 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.270076 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.271952 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302425 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302456 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302466 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302476 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.319058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data" (OuterVolumeSpecName: "config-data") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.322164 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.352584 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data" (OuterVolumeSpecName: "config-data") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.404543 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.404584 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.404597 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.418145 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod 
"908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.461988 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.202:3000/\": dial tcp 10.217.0.202:3000: connect: connection refused" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.507120 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.507252 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.507303 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:38.507287733 +0000 UTC m=+1513.379914444 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.551325 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.554795 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.556565 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.556656 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.649338 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.655547 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.674962 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.705191 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711722 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711765 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711844 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: 
\"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712080 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712123 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712166 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712195 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712217 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712244 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712305 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: 
I0220 07:11:30.712330 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712349 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712396 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.713303 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.718451 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.719155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs" (OuterVolumeSpecName: "logs") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.724796 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.726186 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs" (OuterVolumeSpecName: "logs") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.726273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts" (OuterVolumeSpecName: "scripts") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.726197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z" (OuterVolumeSpecName: "kube-api-access-fmx2z") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "kube-api-access-fmx2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.737204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs" (OuterVolumeSpecName: "logs") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.744063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm" (OuterVolumeSpecName: "kube-api-access-p8nkm") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "kube-api-access-p8nkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.745511 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw" (OuterVolumeSpecName: "kube-api-access-b2fsw") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "kube-api-access-b2fsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.757054 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.777586 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.792905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data" (OuterVolumeSpecName: "config-data") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.794813 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.801638 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.801666 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814611 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: 
\"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814641 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814660 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814769 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814825 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814846 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814899 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814950 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815002 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815027 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815067 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815164 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc 
kubenswrapper[5094]: I0220 07:11:30.815259 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815302 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815326 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815355 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815790 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815818 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc 
kubenswrapper[5094]: I0220 07:11:30.815828 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815840 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815851 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815860 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815869 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815877 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815901 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815911 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815921 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815931 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815941 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.817578 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs" (OuterVolumeSpecName: "logs") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.817987 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.818019 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.818599 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data" (OuterVolumeSpecName: "config-data") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.824401 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts" (OuterVolumeSpecName: "scripts") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.824652 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8" (OuterVolumeSpecName: "kube-api-access-lb6j8") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "kube-api-access-lb6j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.826800 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.839086 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6" (OuterVolumeSpecName: "kube-api-access-knph6") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "kube-api-access-knph6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.840652 5094 generic.go:334] "Generic (PLEG): container finished" podID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.840756 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerDied","Data":"d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.841048 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz" (OuterVolumeSpecName: "kube-api-access-gjgxz") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "kube-api-access-gjgxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.844317 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data" (OuterVolumeSpecName: "config-data") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.844338 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.848854 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.853257 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.855089 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.855807 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.865989 5094 generic.go:334] "Generic (PLEG): container finished" podID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerID="2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.866074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerDied","Data":"2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.882829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.898633 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.899543 5094 generic.go:334] "Generic (PLEG): container finished" podID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerID="e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725" exitCode=1 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.899647 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerDied","Data":"e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.899761 5094 scope.go:117] "RemoveContainer" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.900226 5094 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-5vc6z" secret="" err="secret \"galera-openstack-dockercfg-cp9vz\" not found" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.900262 5094 scope.go:117] "RemoveContainer" containerID="e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.900601 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-5vc6z_openstack(0330a367-c0c9-42a9-9993-1a3b6775fd3b)\"" pod="openstack/root-account-create-update-5vc6z" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918181 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918220 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918234 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918246 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerDied","Data":"5c342dad28df34f1d8d92f5a04877af4cb07a57675de3adb965ed98cfe8eaa77"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918591 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.923998 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924025 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924037 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924048 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924059 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924094 5094 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc 
kubenswrapper[5094]: I0220 07:11:30.924105 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924114 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924124 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924133 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924142 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938352 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938396 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947075 5094 generic.go:334] "Generic (PLEG): container finished" podID="1fe9db54-4204-4335-a272-c469e0923478" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" exitCode=2 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerDied","Data":"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerDied","Data":"8a0b13fdbdedc5064e8f68c82ce215006ed4f58e7530fd19fcca453a9915c200"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947308 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.954124 5094 scope.go:117] "RemoveContainer" containerID="d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.956675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerDied","Data":"106993ad972a41a70b0e11997dac58bd4e6ab90384569399045d1bedeaba95e2"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.956872 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.959889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.970504 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.975416 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data" (OuterVolumeSpecName: "config-data") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.977989 5094 generic.go:334] "Generic (PLEG): container finished" podID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.978184 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.978226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerDied","Data":"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.978325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerDied","Data":"c905b3c584b4bdb1a44662bb87e5389e8137126047bfc23039edbbaea024118a"} Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:30.992945 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerDied","Data":"1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c"} Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:30.993054 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.020324 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.026976 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.027571 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.027590 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.027606 5094 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.028551 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.029161 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data 
podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:39.029138611 +0000 UTC m=+1513.901765322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.029269 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.029391 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts podName:0330a367-c0c9-42a9-9993-1a3b6775fd3b nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.529362816 +0000 UTC m=+1506.401989527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts") pod "root-account-create-update-5vc6z" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b") : configmap "openstack-scripts" not found Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.031200 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.040924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data" (OuterVolumeSpecName: "config-data") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.042975 5094 scope.go:117] "RemoveContainer" containerID="7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.046943 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.047224 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.047280 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerDied","Data":"cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda"} Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.062936 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.078963 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce\": container with ID starting with 8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce not found: ID does not exist" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.079014 5094 kuberuntime_gc.go:361] "Error getting ContainerStatus for containerID" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" err="rpc error: code = NotFound desc = could not find container \"8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce\": container with ID starting with 8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce not found: ID does not exist" Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.079044 5094 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/root-account-create-update-5vc6z_openstack_mariadb-account-create-update-8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce.log: no such file or directory" path="/var/log/containers/root-account-create-update-5vc6z_openstack_mariadb-account-create-update-8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce.log" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.103252 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.103774 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerDied","Data":"85e8b39c918088f619ee9b44d81cb6828488069406841b038890d491ba98168a"} Feb 20 07:11:31 
crc kubenswrapper[5094]: I0220 07:11:31.108015 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.115382 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7677694455-llk7m" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.197:5353: i/o timeout" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.141188 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.141237 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.141248 5094 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.244210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerDied","Data":"32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.244415 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.246077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.246135 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.246825 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.246881 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:33.246860175 +0000 UTC m=+1508.119486886 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.260320 5094 projected.go:194] Error preparing data for projected volume kube-api-access-tvlxh for pod openstack/keystone-6cc2-account-create-update-98lt6: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.260412 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:33.260383289 +0000 UTC m=+1508.133010000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tvlxh" (UniqueName: "kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.282621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerDied","Data":"ca1059f5843b1683c2b383612bdd39e42db92c13986bf2a30fa4cc0e0bdde634"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.282782 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.303382 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.317875 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320138 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320621 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320753 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerDied","Data":"9f163abc1efd183ebd2809a660db1c44ccc0e92e53e74d9ce4dfa48299f86759"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-tvf55" event={"ID":"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7","Type":"ContainerDied","Data":"9e2e1470fd33e88144567dad9b332e30ed6e81a2d129cafee24b4fca5bfc7939"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320935 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.371955 5094 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.371990 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.395327 5094 scope.go:117] "RemoveContainer" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.515116 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.532103 5094 scope.go:117] "RemoveContainer" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.536234 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9\": container with ID starting with 0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9 not found: ID does not exist" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.536284 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9"} err="failed to get container status \"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9\": rpc error: code = NotFound desc = could not find container \"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9\": container with ID starting with 0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9 not found: ID does not exist" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.536313 5094 scope.go:117] "RemoveContainer" containerID="a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.570207 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574672 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574907 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574962 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.575012 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.575349 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.576372 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.576457 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts podName:0330a367-c0c9-42a9-9993-1a3b6775fd3b nodeName:}" failed. No retries permitted until 2026-02-20 07:11:32.576437787 +0000 UTC m=+1507.449064498 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts") pod "root-account-create-update-5vc6z" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b") : configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.578062 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.593379 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts" (OuterVolumeSpecName: "scripts") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.601251 5094 scope.go:117] "RemoveContainer" containerID="459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.603994 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.605943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w" (OuterVolumeSpecName: "kube-api-access-f8b4w") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "kube-api-access-f8b4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.616870 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.631535 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.649453 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.668498 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"f3caa33a-a0ec-4fdc-876b-266724a5af50\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683209 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"f3caa33a-a0ec-4fdc-876b-266724a5af50\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"f3caa33a-a0ec-4fdc-876b-266724a5af50\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683908 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683922 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc 
kubenswrapper[5094]: I0220 07:11:31.683934 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683947 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.684197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.690597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5" (OuterVolumeSpecName: "kube-api-access-7snx5") pod "f3caa33a-a0ec-4fdc-876b-266724a5af50" (UID: "f3caa33a-a0ec-4fdc-876b-266724a5af50"). InnerVolumeSpecName "kube-api-access-7snx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.692193 5094 scope.go:117] "RemoveContainer" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.702422 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.717878 5094 scope.go:117] "RemoveContainer" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.718041 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.718693 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19\": container with ID starting with 631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19 not found: ID does not exist" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.718741 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19"} err="failed to get container status \"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19\": rpc error: code = NotFound desc = could not find container \"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19\": container with ID starting with 631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19 not found: ID does not exist" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.718768 5094 scope.go:117] "RemoveContainer" containerID="92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 
07:11:31.741395 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.746867 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data" (OuterVolumeSpecName: "config-data") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.753464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data" (OuterVolumeSpecName: "config-data") pod "f3caa33a-a0ec-4fdc-876b-266724a5af50" (UID: "f3caa33a-a0ec-4fdc-876b-266724a5af50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.757481 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3caa33a-a0ec-4fdc-876b-266724a5af50" (UID: "f3caa33a-a0ec-4fdc-876b-266724a5af50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.757523 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.765370 5094 scope.go:117] "RemoveContainer" containerID="c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.765574 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.780574 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785926 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785942 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785952 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785961 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785969 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.795302 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.810484 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.811346 5094 scope.go:117] "RemoveContainer" containerID="50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.821075 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.827377 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.832668 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.832904 5094 scope.go:117] "RemoveContainer" containerID="229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.838757 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.856718 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe9db54-4204-4335-a272-c469e0923478" path="/var/lib/kubelet/pods/1fe9db54-4204-4335-a272-c469e0923478/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.856911 5094 scope.go:117] "RemoveContainer" containerID="5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.857623 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" path="/var/lib/kubelet/pods/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.858327 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" path="/var/lib/kubelet/pods/8f8cb333-2939-4404-b242-67bcf4e6875b/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.858936 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" path="/var/lib/kubelet/pods/908e2706-d24f-41c9-b481-4c0d5415c5ca/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.860314 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" path="/var/lib/kubelet/pods/92877559-6960-4dbf-890a-fb563f4b0bf8/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.861493 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" path="/var/lib/kubelet/pods/ca3adffb-7baf-45db-ab16-cc1c63510fec/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.862444 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" path="/var/lib/kubelet/pods/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.863164 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" path="/var/lib/kubelet/pods/f11aa87b-3964-4a62-871f-bdf7d1ad7848/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.864784 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.864814 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.869763 5094 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.872771 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.877824 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.893194 5094 scope.go:117] "RemoveContainer" containerID="0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.918404 5094 scope.go:117] "RemoveContainer" containerID="d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.926273 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928114 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928315 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not 
found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928676 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928798 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.933151 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.938304 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.938346 5094 prober.go:104] "Probe errored" err="rpc error: code 
= Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.953131 5094 scope.go:117] "RemoveContainer" containerID="e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.974071 5094 scope.go:117] "RemoveContainer" containerID="d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.992486 5094 scope.go:117] "RemoveContainer" containerID="fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.012177 5094 scope.go:117] "RemoveContainer" containerID="b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.076549 5094 scope.go:117] "RemoveContainer" containerID="c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.352077 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.351929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerDied","Data":"e629f6202467daa64ea1c5522af0e65990925325c9e7a625f6d5ea287157f10f"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.352527 5094 scope.go:117] "RemoveContainer" containerID="0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.378191 5094 generic.go:334] "Generic (PLEG): container finished" podID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerID="74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6" exitCode=0 Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.378384 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerDied","Data":"74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.397587 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerDied","Data":"04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.397845 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.421956 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cd92d75e-9882-4bb7-a41e-cab9777424e8/ovn-northd/0.log" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.421993 5094 generic.go:334] "Generic (PLEG): container finished" podID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" exitCode=139 Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.422044 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerDied","Data":"3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.422068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerDied","Data":"152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.422081 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.443953 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.552925 5094 scope.go:117] "RemoveContainer" containerID="2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.577777 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cd92d75e-9882-4bb7-a41e-cab9777424e8/ovn-northd/0.log" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.577890 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.588158 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.609869 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.609941 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.609989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610025 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610102 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610155 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610190 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: 
\"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610300 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610334 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610356 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610438 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610531 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610582 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610643 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:32.611525 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:32.611595 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts podName:0330a367-c0c9-42a9-9993-1a3b6775fd3b nodeName:}" failed. No retries permitted until 2026-02-20 07:11:34.611575189 +0000 UTC m=+1509.484201900 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts") pod "root-account-create-update-5vc6z" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b") : configmap "openstack-scripts" not found
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.614733 5094 scope.go:117] "RemoveContainer" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244"
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.615589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.619269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.621799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct" (OuterVolumeSpecName: "kube-api-access-968ct") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "kube-api-access-968ct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.624294 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.660003 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts" (OuterVolumeSpecName: "scripts") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.661406 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.661405 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.661536 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused"
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.662816 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"]
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.663203 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.663362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config" (OuterVolumeSpecName: "config") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.666034 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb" (OuterVolumeSpecName: "kube-api-access-8fksb") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "kube-api-access-8fksb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.669193 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.693157 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"]
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.696155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info" (OuterVolumeSpecName: "pod-info") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.713958 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.713994 5094 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714006 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714016 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714026 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714036 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714049 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714057 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714069 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714080 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714089 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714098 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714128 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714144 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.727606 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.736854 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.738411 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.752603 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf" (OuterVolumeSpecName: "server-conf") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.752731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.760833 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.766544 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.777626 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data" (OuterVolumeSpecName: "config-data") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.777680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.792910 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818918 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818956 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818966 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818975 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818985 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.819001 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.825993 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.830517 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vc6z"
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.920658 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") "
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.920989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") "
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.921424 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0330a367-c0c9-42a9-9993-1a3b6775fd3b" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.923588 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.923723 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.926972 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw" (OuterVolumeSpecName: "kube-api-access-bfrlw") pod "0330a367-c0c9-42a9-9993-1a3b6775fd3b" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b"). InnerVolumeSpecName "kube-api-access-bfrlw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.987296 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025140 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025225 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025529 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025767 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025795 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025926 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.026673 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.027565 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.028031 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.028554 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.032986 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb" (OuterVolumeSpecName: "kube-api-access-b5mtb") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "kube-api-access-b5mtb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.033934 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.085399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.115370 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.121084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.131548 5094 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.131585 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132082 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132130 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132142 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132151 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132160 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132171 5094 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.147220 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-55468cd684-wv6dn" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.151:5000/v3\": dial tcp 10.217.0.151:5000: connect: connection refused"
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.162756 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.237082 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.302671 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338573 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338690 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338741 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338828 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338861 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338930 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338951 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338978 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.339039 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.339908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.342115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.343101 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.349725 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info" (OuterVolumeSpecName: "pod-info") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.362236 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.362958 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.365558 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.374489 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr" (OuterVolumeSpecName: "kube-api-access-fxfjr") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "kube-api-access-fxfjr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.390606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data" (OuterVolumeSpecName: "config-data") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.411599 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf" (OuterVolumeSpecName: "server-conf") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440888 5094 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440937 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440951 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440962 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440973 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440989 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440998 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.441024 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.441033 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.441041 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.460498 5094 generic.go:334] "Generic (PLEG): container finished" podID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" exitCode=0
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.460562 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerDied","Data":"c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf"}
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.462644 5094 generic.go:334] "Generic (PLEG): container finished" podID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerID="d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051" exitCode=0
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.462684 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerDied","Data":"d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051"}
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466023 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" exitCode=0
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerDied","Data":"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27"}
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerDied","Data":"69690ca4b7abd1bc1955c808ad93fa95a3a579fa31419e9ead102d78d2680915"}
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466106 5094 scope.go:117] "RemoveContainer" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27"
Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466224 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.469072 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.472666 5094 generic.go:334] "Generic (PLEG): container finished" podID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" exitCode=0 Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.472870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerDied","Data":"e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.477416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerDied","Data":"09536db7b188d6d7e190e80f4a66c3a18be0cf58fe509f674089f0d4cd9626eb"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.477497 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482655 5094 generic.go:334] "Generic (PLEG): container finished" podID="a829c6b3-7069-4544-90dc-40ae83aba524" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" exitCode=0 Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482752 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482757 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerDied","Data":"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerDied","Data":"85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.489855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.494262 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.494288 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.494309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerDied","Data":"a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.544421 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.544463 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.545386 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.555432 5094 scope.go:117] "RemoveContainer" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.593394 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.604660 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.620655 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.621116 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.621442 5094 scope.go:117] "RemoveContainer" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.622565 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27\": container with ID starting with b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27 not found: ID does not exist" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622607 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27"} err="failed to get container status \"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27\": rpc error: code = NotFound desc = could not find container \"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27\": container with ID starting with b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27 not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622634 5094 scope.go:117] "RemoveContainer" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.622915 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25\": container with ID starting with c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25 not found: 
ID does not exist" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622941 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25"} err="failed to get container status \"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25\": rpc error: code = NotFound desc = could not find container \"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25\": container with ID starting with c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25 not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622960 5094 scope.go:117] "RemoveContainer" containerID="e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.630570 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.641552 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"a5bbb9ad-deeb-495f-9750-f7012c00061d\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645900 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646029 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"a5bbb9ad-deeb-495f-9750-f7012c00061d\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646095 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646128 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646150 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: 
\"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646170 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.647130 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"a5bbb9ad-deeb-495f-9750-f7012c00061d\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.647158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.651201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.651963 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.656491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7" (OuterVolumeSpecName: "kube-api-access-whfw7") pod "a5bbb9ad-deeb-495f-9750-f7012c00061d" (UID: "a5bbb9ad-deeb-495f-9750-f7012c00061d"). InnerVolumeSpecName "kube-api-access-whfw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.658308 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts" (OuterVolumeSpecName: "scripts") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.658437 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.662638 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l" (OuterVolumeSpecName: "kube-api-access-7dw5l") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "kube-api-access-7dw5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.674695 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.690575 5094 scope.go:117] "RemoveContainer" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.691784 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.702870 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.703996 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5bbb9ad-deeb-495f-9750-f7012c00061d" (UID: "a5bbb9ad-deeb-495f-9750-f7012c00061d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.739454 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data" (OuterVolumeSpecName: "config-data") pod "a5bbb9ad-deeb-495f-9750-f7012c00061d" (UID: "a5bbb9ad-deeb-495f-9750-f7012c00061d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.748544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.748660 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.748943 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749418 5094 scope.go:117] "RemoveContainer" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749473 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749489 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749503 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749515 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749531 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749542 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749554 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.753403 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv" (OuterVolumeSpecName: "kube-api-access-fpdrv") pod "b1504790-ccaf-42d5-a28a-a25f0cd353c9" (UID: "b1504790-ccaf-42d5-a28a-a25f0cd353c9"). InnerVolumeSpecName "kube-api-access-fpdrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.769293 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.772314 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.774136 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data" (OuterVolumeSpecName: "config-data") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.778915 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data" (OuterVolumeSpecName: "config-data") pod "b1504790-ccaf-42d5-a28a-a25f0cd353c9" (UID: "b1504790-ccaf-42d5-a28a-a25f0cd353c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.785726 5094 scope.go:117] "RemoveContainer" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.786285 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8\": container with ID starting with bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8 not found: ID does not exist" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.786325 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8"} err="failed to get container status \"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8\": rpc error: code = NotFound desc = could not find container \"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8\": container with ID starting with bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8 not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.786356 5094 scope.go:117] "RemoveContainer" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.787371 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c\": container with ID starting with 24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c not found: ID does not exist" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.787405 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c"} err="failed to get container status \"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c\": rpc error: code = NotFound desc = could not find container \"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c\": container with ID starting with 24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.787424 5094 scope.go:117] "RemoveContainer" containerID="74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.787550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1504790-ccaf-42d5-a28a-a25f0cd353c9" (UID: "b1504790-ccaf-42d5-a28a-a25f0cd353c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.801278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.806799 5094 scope.go:117] "RemoveContainer" containerID="0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.822585 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.828291 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.850296 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" path="/var/lib/kubelet/pods/0330a367-c0c9-42a9-9993-1a3b6775fd3b/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851223 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" path="/var/lib/kubelet/pods/219c74d6-9f45-4bf8-8c67-acdea3c0fab3/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851630 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851654 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851674 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851687 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851714 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851725 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851734 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.853595 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" path="/var/lib/kubelet/pods/3d3ab399-3fc6-47e1-995c-5e855c554e9e/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.854401 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" path="/var/lib/kubelet/pods/5d9f1f40-92cc-4f19-9f3b-49651f56bffb/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.856660 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762a565c-672e-4127-a8c6-90f721eeda81" path="/var/lib/kubelet/pods/762a565c-672e-4127-a8c6-90f721eeda81/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.858740 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" path="/var/lib/kubelet/pods/7dd0ff85-ae3a-4035-a096-fea5952b19a7/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 
07:11:33.859494 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" path="/var/lib/kubelet/pods/a829c6b3-7069-4544-90dc-40ae83aba524/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.860437 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6" path="/var/lib/kubelet/pods/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.860805 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" path="/var/lib/kubelet/pods/cd92d75e-9882-4bb7-a41e-cab9777424e8/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.861352 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" path="/var/lib/kubelet/pods/d01cbaa4-5543-4cd5-b098-7e4600819d32/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.862351 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" path="/var/lib/kubelet/pods/f3caa33a-a0ec-4fdc-876b-266724a5af50/volumes" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.015672 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054014 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054141 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054187 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054222 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054303 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmtnp\" (UniqueName: 
\"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054337 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.055016 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.055955 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.063158 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts" (OuterVolumeSpecName: "scripts") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.071961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp" (OuterVolumeSpecName: "kube-api-access-rmtnp") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "kube-api-access-rmtnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.125797 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.138900 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.155929 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161152 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161191 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161202 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161214 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161232 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161242 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161252 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.174964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data" (OuterVolumeSpecName: "config-data") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.262728 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.510976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerDied","Data":"15996ec151f9ac116c6912aa4e992bb9af3fc72485808d76d5e14b93da12f57f"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.511043 5094 scope.go:117] "RemoveContainer" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.511002 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.524986 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerDied","Data":"8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.525069 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.537933 5094 scope.go:117] "RemoveContainer" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.543558 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548211 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" exitCode=0 Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548291 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548401 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.552860 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerDied","Data":"0e6de3c16ee5f3004f5d74169204f09eb1abb0191c70b76d1a09ff44b2f07e6d"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.552923 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.562875 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.569182 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.569236 5094 scope.go:117] "RemoveContainer" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.584813 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.596189 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.606741 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.611339 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.615737 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.627243 5094 scope.go:117] "RemoveContainer" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" Feb 
20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.649446 5094 scope.go:117] "RemoveContainer" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.679026 5094 scope.go:117] "RemoveContainer" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.706377 5094 scope.go:117] "RemoveContainer" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.707323 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92\": container with ID starting with 4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92 not found: ID does not exist" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.707382 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92"} err="failed to get container status \"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92\": rpc error: code = NotFound desc = could not find container \"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92\": container with ID starting with 4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92 not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.707417 5094 scope.go:117] "RemoveContainer" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.708135 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a\": container with ID starting with 0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a not found: ID does not exist" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708184 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a"} err="failed to get container status \"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a\": rpc error: code = NotFound desc = could not find container \"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a\": container with ID starting with 0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708226 5094 scope.go:117] "RemoveContainer" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.708609 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c\": container with ID starting with b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c not found: ID does not exist" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708635 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c"} err="failed to get container status \"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c\": rpc error: code = NotFound desc = could not find container \"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c\": container with ID 
starting with b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708653 5094 scope.go:117] "RemoveContainer" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.708981 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4\": container with ID starting with f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4 not found: ID does not exist" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.709003 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4"} err="failed to get container status \"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4\": rpc error: code = NotFound desc = could not find container \"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4\": container with ID starting with f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4 not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.709018 5094 scope.go:117] "RemoveContainer" containerID="d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.862546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" path="/var/lib/kubelet/pods/1218d679-0e51-4bef-9526-db16c8783d8b/volumes" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.863916 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" 
path="/var/lib/kubelet/pods/732b4015-53b2-4422-b7d1-12b65f6e0c92/volumes" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.864572 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" path="/var/lib/kubelet/pods/a5bbb9ad-deeb-495f-9750-f7012c00061d/volumes" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.869162 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" path="/var/lib/kubelet/pods/b1504790-ccaf-42d5-a28a-a25f0cd353c9/volumes" Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.925885 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.926326 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.926589 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.926619 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.927358 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.928647 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.936529 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.936566 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.928118 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.928355 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.932005 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.935390 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 
07:11:41.935464 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.935492 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.937642 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.937719 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.669838 5094 generic.go:334] "Generic (PLEG): container finished" podID="530069d2-7146-46eb-9c88-056cc8a583b2" containerID="2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4" exitCode=0 Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.669914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" 
event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerDied","Data":"2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4"} Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.841336 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:42 crc kubenswrapper[5094]: E0220 07:11:42.842158 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.953597 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.149934 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.150041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.150158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod 
\"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151314 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151538 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151657 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.169752 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd" (OuterVolumeSpecName: "kube-api-access-wmlpd") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "kube-api-access-wmlpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.175928 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.211391 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.218242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.220021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config" (OuterVolumeSpecName: "config") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.225550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.231460 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255553 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255603 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255623 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255645 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") on node \"crc\" DevicePath \"\"" Feb 20 
07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255666 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255685 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255725 5094 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.684758 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerDied","Data":"bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368"} Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.684857 5094 scope.go:117] "RemoveContainer" containerID="5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.684883 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.722004 5094 scope.go:117] "RemoveContainer" containerID="2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.745463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.753295 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.853299 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" path="/var/lib/kubelet/pods/530069d2-7146-46eb-9c88-056cc8a583b2/volumes" Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.924791 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.927093 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.927090 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: 
container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.927899 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.928052 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.929861 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.933853 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.933932 5094 prober.go:104] "Probe 
errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.924924 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926250 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926444 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926747 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" 
containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926804 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.927857 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.930355 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.930393 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.533086 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tj42x_07969dc9-1a07-455c-b6c4-6b5f3bb23cb9/ovs-vswitchd/0.log" Feb 20 
07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.534393 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.697503 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.697640 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run" (OuterVolumeSpecName: "var-run") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698073 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698165 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 
07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698221 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib" (OuterVolumeSpecName: "var-lib") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698319 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698353 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log" (OuterVolumeSpecName: "var-log") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699041 5094 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699089 5094 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699106 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699118 5094 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699607 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts" (OuterVolumeSpecName: "scripts") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.715192 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866" (OuterVolumeSpecName: "kube-api-access-88866") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "kube-api-access-88866". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.799696 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.799758 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.831652 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tj42x_07969dc9-1a07-455c-b6c4-6b5f3bb23cb9/ovs-vswitchd/0.log" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832407 5094 generic.go:334] "Generic (PLEG): container finished" podID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" exitCode=137 Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832449 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a"} Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5"} Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832502 5094 scope.go:117] "RemoveContainer" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832693 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.902485 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.923000 5094 scope.go:117] "RemoveContainer" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.929306 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.966356 5094 scope.go:117] "RemoveContainer" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.001683 5094 scope.go:117] "RemoveContainer" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" Feb 20 07:11:55 crc kubenswrapper[5094]: E0220 07:11:55.002285 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a\": container with ID starting with 381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a not found: ID does not exist" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.002331 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a"} err="failed to get container status \"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a\": rpc error: code = NotFound desc = could not find container \"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a\": container with ID starting with 381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a not found: ID does not exist" Feb 20 07:11:55 
crc kubenswrapper[5094]: I0220 07:11:55.002363 5094 scope.go:117] "RemoveContainer" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:55 crc kubenswrapper[5094]: E0220 07:11:55.002851 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d\": container with ID starting with ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d not found: ID does not exist" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.002915 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d"} err="failed to get container status \"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d\": rpc error: code = NotFound desc = could not find container \"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d\": container with ID starting with ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d not found: ID does not exist" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.002949 5094 scope.go:117] "RemoveContainer" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" Feb 20 07:11:55 crc kubenswrapper[5094]: E0220 07:11:55.003279 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf\": container with ID starting with 35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf not found: ID does not exist" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.003307 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf"} err="failed to get container status \"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf\": rpc error: code = NotFound desc = could not find container \"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf\": container with ID starting with 35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf not found: ID does not exist" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.287806 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412301 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412793 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 
07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412918 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.413616 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock" (OuterVolumeSpecName: "lock") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.413611 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache" (OuterVolumeSpecName: "cache") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.418243 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.420950 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x" (OuterVolumeSpecName: "kube-api-access-n5g7x") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "kube-api-access-n5g7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.425827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "swift") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514236 5094 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514456 5094 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514544 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514600 5094 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc 
kubenswrapper[5094]: I0220 07:11:55.514672 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.528497 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.616733 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.654743 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.717513 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.849974 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" path="/var/lib/kubelet/pods/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9/volumes" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.850696 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" exitCode=137 Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.851017 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.851151 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991"} Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.852252 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"b318a2984af52699dbcc87bf8935047ececfd11a736630826d02b012b12ef5e4"} Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.852309 5094 scope.go:117] "RemoveContainer" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.889972 5094 scope.go:117] "RemoveContainer" containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" Feb 20 07:11:55 crc 
kubenswrapper[5094]: I0220 07:11:55.910509 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.918560 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.928653 5094 scope.go:117] "RemoveContainer" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.964107 5094 scope.go:117] "RemoveContainer" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.980624 5094 scope.go:117] "RemoveContainer" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.003127 5094 scope.go:117] "RemoveContainer" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.021117 5094 scope.go:117] "RemoveContainer" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.040460 5094 scope.go:117] "RemoveContainer" containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.088800 5094 scope.go:117] "RemoveContainer" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.107506 5094 scope.go:117] "RemoveContainer" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.124199 5094 scope.go:117] "RemoveContainer" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.146269 5094 scope.go:117] "RemoveContainer" 
containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.168287 5094 scope.go:117] "RemoveContainer" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.191116 5094 scope.go:117] "RemoveContainer" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.210323 5094 scope.go:117] "RemoveContainer" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.230897 5094 scope.go:117] "RemoveContainer" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.231563 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991\": container with ID starting with 694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991 not found: ID does not exist" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.231600 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991"} err="failed to get container status \"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991\": rpc error: code = NotFound desc = could not find container \"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991\": container with ID starting with 694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.231626 5094 scope.go:117] "RemoveContainer" 
containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.232107 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76\": container with ID starting with d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76 not found: ID does not exist" containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.232190 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76"} err="failed to get container status \"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76\": rpc error: code = NotFound desc = could not find container \"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76\": container with ID starting with d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.232263 5094 scope.go:117] "RemoveContainer" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.232806 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce\": container with ID starting with ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce not found: ID does not exist" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.233190 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce"} err="failed to get container status \"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce\": rpc error: code = NotFound desc = could not find container \"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce\": container with ID starting with ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.233257 5094 scope.go:117] "RemoveContainer" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.234160 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c\": container with ID starting with 997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c not found: ID does not exist" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234251 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c"} err="failed to get container status \"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c\": rpc error: code = NotFound desc = could not find container \"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c\": container with ID starting with 997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234315 5094 scope.go:117] "RemoveContainer" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.234717 5094 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d\": container with ID starting with c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d not found: ID does not exist" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234745 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d"} err="failed to get container status \"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d\": rpc error: code = NotFound desc = could not find container \"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d\": container with ID starting with c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234764 5094 scope.go:117] "RemoveContainer" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.235067 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2\": container with ID starting with b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2 not found: ID does not exist" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235116 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2"} err="failed to get container status \"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2\": rpc error: code = NotFound desc = could not find container 
\"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2\": container with ID starting with b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235144 5094 scope.go:117] "RemoveContainer" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.235622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601\": container with ID starting with 6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601 not found: ID does not exist" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235645 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601"} err="failed to get container status \"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601\": rpc error: code = NotFound desc = could not find container \"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601\": container with ID starting with 6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235657 5094 scope.go:117] "RemoveContainer" containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.235937 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3\": container with ID starting with ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3 not found: ID does not exist" 
containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236009 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3"} err="failed to get container status \"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3\": rpc error: code = NotFound desc = could not find container \"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3\": container with ID starting with ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236076 5094 scope.go:117] "RemoveContainer" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.236654 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e\": container with ID starting with da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e not found: ID does not exist" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236738 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e"} err="failed to get container status \"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e\": rpc error: code = NotFound desc = could not find container \"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e\": container with ID starting with da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236797 5094 scope.go:117] 
"RemoveContainer" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.237149 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813\": container with ID starting with 798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813 not found: ID does not exist" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237199 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813"} err="failed to get container status \"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813\": rpc error: code = NotFound desc = could not find container \"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813\": container with ID starting with 798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237229 5094 scope.go:117] "RemoveContainer" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.237569 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53\": container with ID starting with 13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53 not found: ID does not exist" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237595 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53"} err="failed to get container status \"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53\": rpc error: code = NotFound desc = could not find container \"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53\": container with ID starting with 13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237613 5094 scope.go:117] "RemoveContainer" containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.237971 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b\": container with ID starting with 876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b not found: ID does not exist" containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238016 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b"} err="failed to get container status \"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b\": rpc error: code = NotFound desc = could not find container \"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b\": container with ID starting with 876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238046 5094 scope.go:117] "RemoveContainer" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.238377 5094 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a\": container with ID starting with 87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a not found: ID does not exist" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238405 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a"} err="failed to get container status \"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a\": rpc error: code = NotFound desc = could not find container \"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a\": container with ID starting with 87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238420 5094 scope.go:117] "RemoveContainer" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.238903 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9\": container with ID starting with 4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9 not found: ID does not exist" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238930 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9"} err="failed to get container status \"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9\": rpc error: code = NotFound desc = could not find container 
\"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9\": container with ID starting with 4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238964 5094 scope.go:117] "RemoveContainer" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.239231 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c\": container with ID starting with 77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c not found: ID does not exist" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.239281 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c"} err="failed to get container status \"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c\": rpc error: code = NotFound desc = could not find container \"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c\": container with ID starting with 77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.840220 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.840942 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:57 crc kubenswrapper[5094]: I0220 07:11:57.849246 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" path="/var/lib/kubelet/pods/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3/volumes" Feb 20 07:12:08 crc kubenswrapper[5094]: I0220 07:12:08.841014 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:08 crc kubenswrapper[5094]: E0220 07:12:08.841834 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:22 crc kubenswrapper[5094]: I0220 07:12:22.840560 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:22 crc kubenswrapper[5094]: E0220 07:12:22.841854 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:36 crc kubenswrapper[5094]: I0220 07:12:36.840664 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:36 crc kubenswrapper[5094]: E0220 07:12:36.841999 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:47 crc kubenswrapper[5094]: I0220 07:12:47.839936 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:47 crc kubenswrapper[5094]: E0220 07:12:47.841024 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:58 crc kubenswrapper[5094]: I0220 07:12:58.840198 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:58 crc kubenswrapper[5094]: E0220 07:12:58.841860 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:13:12 crc kubenswrapper[5094]: I0220 07:13:12.841484 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:13:12 crc kubenswrapper[5094]: E0220 07:13:12.844248 5094 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:13:27 crc kubenswrapper[5094]: I0220 07:13:27.841034 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:13:27 crc kubenswrapper[5094]: E0220 07:13:27.842447 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.716890 5094 scope.go:117] "RemoveContainer" containerID="d9a07c98406e23d72c5a2bc3d04e8964b30bc89dab757f6e64abbd3de62c1272" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.761574 5094 scope.go:117] "RemoveContainer" containerID="378b26e1e0650ae576632665d611910465c17369e442435b9765cd97f7bbf4b7" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.799572 5094 scope.go:117] "RemoveContainer" containerID="627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.829819 5094 scope.go:117] "RemoveContainer" containerID="7c8241aa612d986c2efd3e576b0082b1361568858d2a7098f35d783948c494f3" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.867131 5094 scope.go:117] "RemoveContainer" containerID="b30b7402d8eb93fe27dd4eeb5df1c58c1d66056e0ef8f55ed4b6d91fb78c16c7" Feb 20 07:13:31 crc 
kubenswrapper[5094]: I0220 07:13:31.933528 5094 scope.go:117] "RemoveContainer" containerID="75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.957945 5094 scope.go:117] "RemoveContainer" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.998213 5094 scope.go:117] "RemoveContainer" containerID="cef36671b09afd9d82ea9087a220ba378848b8caf63bdb34c5ff82372929ee6f" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.039445 5094 scope.go:117] "RemoveContainer" containerID="aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.071139 5094 scope.go:117] "RemoveContainer" containerID="59cc73fd6408558710efa6324658cf301b0cc15eed3c78c0c37707c5d008b54e" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.103881 5094 scope.go:117] "RemoveContainer" containerID="faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.165121 5094 scope.go:117] "RemoveContainer" containerID="b09eaa7d98442981b4e7ce37eedd93a2e3a6cd66a6970eb460a847a861e69caa" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.191088 5094 scope.go:117] "RemoveContainer" containerID="c8a3057121d16618bfdbd39860a04679adcd72a905e063ca9af153f1f199e6f2" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.228213 5094 scope.go:117] "RemoveContainer" containerID="bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.296747 5094 scope.go:117] "RemoveContainer" containerID="484f3fb839183cec10038487f86ef12f28aad48e989d27e0f371b4836997c9c1" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.323499 5094 scope.go:117] "RemoveContainer" containerID="b37e6501ae2ac6b9c9a4901b1e8b894900b9c70d4214f260a2bb15e75fba5205" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 
07:13:32.351255 5094 scope.go:117] "RemoveContainer" containerID="a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.384072 5094 scope.go:117] "RemoveContainer" containerID="50b02908599fab0b56ac49b8dfc4de2ac6a680f5927a195974880e894fd05f07"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.416334 5094 scope.go:117] "RemoveContainer" containerID="0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.455510 5094 scope.go:117] "RemoveContainer" containerID="5fbdea48cb9017b90d8f206860f008a7a92776227fe74ba390e642bcf9bceabc"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.477953 5094 scope.go:117] "RemoveContainer" containerID="f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.514846 5094 scope.go:117] "RemoveContainer" containerID="2a24e00dad1ec7884b816153f26b2812e819c54c4c8093cf48001992ec89df96"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.542668 5094 scope.go:117] "RemoveContainer" containerID="12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.596047 5094 scope.go:117] "RemoveContainer" containerID="652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575"
Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.621559 5094 scope.go:117] "RemoveContainer" containerID="ba5029a86f52015ae26ae9c4af241df191f71a5df81010d7bab393d3d450c913"
Feb 20 07:13:38 crc kubenswrapper[5094]: I0220 07:13:38.841176 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:13:38 crc kubenswrapper[5094]: E0220 07:13:38.842089 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:13:52 crc kubenswrapper[5094]: I0220 07:13:52.841414 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:13:52 crc kubenswrapper[5094]: E0220 07:13:52.842329 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:14:06 crc kubenswrapper[5094]: I0220 07:14:06.841070 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:14:06 crc kubenswrapper[5094]: E0220 07:14:06.842621 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:14:21 crc kubenswrapper[5094]: I0220 07:14:21.840449 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:14:21 crc kubenswrapper[5094]: E0220 07:14:21.841077 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.126406 5094 scope.go:117] "RemoveContainer" containerID="8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8"
Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.184830 5094 scope.go:117] "RemoveContainer" containerID="8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc"
Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.222370 5094 scope.go:117] "RemoveContainer" containerID="2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82"
Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.292167 5094 scope.go:117] "RemoveContainer" containerID="d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170"
Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.840867 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:14:33 crc kubenswrapper[5094]: E0220 07:14:33.841287 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:14:46 crc kubenswrapper[5094]: I0220 07:14:46.840936 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:14:46 crc kubenswrapper[5094]: E0220 07:14:46.842325 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.166139 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"]
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167807 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167832 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167859 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167872 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167885 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167898 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167920 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167933 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167959 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167971 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167987 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167999 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168021 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168033 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168047 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168059 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168072 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168084 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168104 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168119 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168138 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168151 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168169 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168182 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168198 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168210 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168232 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168245 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168266 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168278 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168314 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168326 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168349 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168364 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168386 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168399 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168424 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168437 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168451 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168463 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168476 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168487 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168509 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168521 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168542 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168554 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168569 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168581 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168597 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168609 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168623 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168634 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168665 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168678 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168691 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168732 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="setup-container"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168757 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="setup-container"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168779 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168793 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168818 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168852 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168865 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168879 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168891 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168907 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168919 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168936 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168967 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168980 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169000 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server-init"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169013 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server-init"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169034 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="mysql-bootstrap"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169046 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="mysql-bootstrap"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169060 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169073 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169095 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169106 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169121 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169134 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169149 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169165 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169185 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169197 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169214 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="setup-container"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169226 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="setup-container"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169245 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169258 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169274 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169317 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169329 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169343 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169377 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169390 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169403 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169416 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169432 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169444 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169458 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169470 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169489 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169501 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169520 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169532 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169544 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169560 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169578 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169590 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169611 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169623 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169901 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169920 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169933 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169946 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169965 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169982 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169998 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170014 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170030 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170048 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170067 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170082 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170098 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170115 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170131 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170150 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170164 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170177 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170194 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170211 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170230 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170246 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170269 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170291 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170312 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170326 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170345 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170359 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170376 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170391 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170417 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170432 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170450 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170463 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170477 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170493 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170505 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170521 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170534 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170550 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170567 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170584 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170598 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170613 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170626 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170645 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170664 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170678 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170697 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170743 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170779 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata"
Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170800 5094
memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170839 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.171635 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.175167 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.176461 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.189336 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"] Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.286226 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.286477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.286580 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.387956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.388094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.388146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.389100 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.399598 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.414302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.501035 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.779255 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"] Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.841433 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.841991 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:01 crc kubenswrapper[5094]: I0220 07:15:01.025859 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerStarted","Data":"c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686"} Feb 20 07:15:01 crc kubenswrapper[5094]: I0220 07:15:01.025926 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerStarted","Data":"c03a4bde583c80e36a47cfc23ed3195915e04446ab8cdd2137f1d6df7125216f"} Feb 20 07:15:01 crc kubenswrapper[5094]: I0220 07:15:01.064364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" podStartSLOduration=1.06433217 podStartE2EDuration="1.06433217s" podCreationTimestamp="2026-02-20 07:15:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:15:01.056519482 +0000 UTC m=+1715.929146203" watchObservedRunningTime="2026-02-20 07:15:01.06433217 +0000 UTC m=+1715.936958881" Feb 20 07:15:02 crc kubenswrapper[5094]: I0220 07:15:02.038746 5094 generic.go:334] "Generic (PLEG): container finished" podID="74085586-b345-46e6-9367-d3b5243312a4" containerID="c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686" exitCode=0 Feb 20 07:15:02 crc kubenswrapper[5094]: I0220 07:15:02.038919 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerDied","Data":"c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686"} Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.455223 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.554339 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"74085586-b345-46e6-9367-d3b5243312a4\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.554447 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"74085586-b345-46e6-9367-d3b5243312a4\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.554513 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"74085586-b345-46e6-9367-d3b5243312a4\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.555417 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume" (OuterVolumeSpecName: "config-volume") pod "74085586-b345-46e6-9367-d3b5243312a4" (UID: "74085586-b345-46e6-9367-d3b5243312a4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.561954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4" (OuterVolumeSpecName: "kube-api-access-5lrb4") pod "74085586-b345-46e6-9367-d3b5243312a4" (UID: "74085586-b345-46e6-9367-d3b5243312a4"). InnerVolumeSpecName "kube-api-access-5lrb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.563247 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74085586-b345-46e6-9367-d3b5243312a4" (UID: "74085586-b345-46e6-9367-d3b5243312a4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.656240 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") on node \"crc\" DevicePath \"\"" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.656272 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.656284 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:15:04 crc kubenswrapper[5094]: I0220 07:15:04.063443 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerDied","Data":"c03a4bde583c80e36a47cfc23ed3195915e04446ab8cdd2137f1d6df7125216f"} Feb 20 07:15:04 crc kubenswrapper[5094]: I0220 07:15:04.063512 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03a4bde583c80e36a47cfc23ed3195915e04446ab8cdd2137f1d6df7125216f" Feb 20 07:15:04 crc kubenswrapper[5094]: I0220 07:15:04.063594 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:14 crc kubenswrapper[5094]: I0220 07:15:14.844754 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:14 crc kubenswrapper[5094]: E0220 07:15:14.846338 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:25 crc kubenswrapper[5094]: I0220 07:15:25.849342 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:25 crc kubenswrapper[5094]: E0220 07:15:25.850998 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.419993 5094 scope.go:117] "RemoveContainer" containerID="1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.461634 5094 scope.go:117] "RemoveContainer" containerID="87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.512624 5094 scope.go:117] "RemoveContainer" containerID="514774461bdf76f918de93cfcbabf0b67e1bca119186db34bd24f1a423cf7e05" Feb 20 07:15:33 
crc kubenswrapper[5094]: I0220 07:15:33.544983 5094 scope.go:117] "RemoveContainer" containerID="f74e9fd620d60bb7c55d8ca9b94a45b983b355d7aa77e6d394eb827e69cef1af" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.578616 5094 scope.go:117] "RemoveContainer" containerID="6d3b6790676924518ae410dca9464ac17e8adb8be7c1c0809abd3e37c9afadec" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.635650 5094 scope.go:117] "RemoveContainer" containerID="a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.698280 5094 scope.go:117] "RemoveContainer" containerID="5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b" Feb 20 07:15:36 crc kubenswrapper[5094]: I0220 07:15:36.841344 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:36 crc kubenswrapper[5094]: E0220 07:15:36.842676 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:47 crc kubenswrapper[5094]: I0220 07:15:47.840496 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:47 crc kubenswrapper[5094]: E0220 07:15:47.842034 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:59 crc kubenswrapper[5094]: I0220 07:15:59.840208 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:59 crc kubenswrapper[5094]: E0220 07:15:59.841051 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:16:13 crc kubenswrapper[5094]: I0220 07:16:13.841621 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:16:14 crc kubenswrapper[5094]: I0220 07:16:14.931448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df"} Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.610551 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:23 crc kubenswrapper[5094]: E0220 07:16:23.612412 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74085586-b345-46e6-9367-d3b5243312a4" containerName="collect-profiles" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.612432 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="74085586-b345-46e6-9367-d3b5243312a4" containerName="collect-profiles" Feb 20 07:16:23 crc kubenswrapper[5094]: E0220 07:16:23.612465 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" 
containerName="mariadb-account-create-update" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.612474 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.612665 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="74085586-b345-46e6-9367-d3b5243312a4" containerName="collect-profiles" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.614313 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.644536 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.704640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.704930 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.705181 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " 
pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.806770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.806867 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.806922 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.807412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.807563 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc 
kubenswrapper[5094]: I0220 07:16:23.832791 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.942064 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:24 crc kubenswrapper[5094]: I0220 07:16:24.333087 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.028996 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240" exitCode=0 Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.029069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240"} Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.029148 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerStarted","Data":"2aba90f111620ddf577aab25aaa8798b21a91cff9c03d53b756128d812853f94"} Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.032799 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.045038 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" 
event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerStarted","Data":"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"} Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.607558 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.610591 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.649293 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.669016 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.669095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.669151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.770637 
5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.770761 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.770796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.771593 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.771595 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.806409 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.929402 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:27 crc kubenswrapper[5094]: I0220 07:16:27.082539 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e" exitCode=0 Feb 20 07:16:27 crc kubenswrapper[5094]: I0220 07:16:27.082596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"} Feb 20 07:16:27 crc kubenswrapper[5094]: I0220 07:16:27.245265 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.100849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerStarted","Data":"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"} Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.102965 5094 generic.go:334] "Generic (PLEG): container finished" podID="edef768c-5542-4656-a22c-61559bee852a" containerID="445c378b533d8cda17dde1a3261e7c35527956eb919d6b68028d1be9060fcba6" exitCode=0 Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.103067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" 
event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"445c378b533d8cda17dde1a3261e7c35527956eb919d6b68028d1be9060fcba6"} Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.103138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerStarted","Data":"7c54b7f653705f50f4a7768ba3bc601f95148cb9ef2c08121132b654c1dacb31"} Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.140795 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fxgmt" podStartSLOduration=2.694275211 podStartE2EDuration="5.140768737s" podCreationTimestamp="2026-02-20 07:16:23 +0000 UTC" firstStartedPulling="2026-02-20 07:16:25.032509259 +0000 UTC m=+1799.905135970" lastFinishedPulling="2026-02-20 07:16:27.479002785 +0000 UTC m=+1802.351629496" observedRunningTime="2026-02-20 07:16:28.139316212 +0000 UTC m=+1803.011942943" watchObservedRunningTime="2026-02-20 07:16:28.140768737 +0000 UTC m=+1803.013395458" Feb 20 07:16:29 crc kubenswrapper[5094]: I0220 07:16:29.114484 5094 generic.go:334] "Generic (PLEG): container finished" podID="edef768c-5542-4656-a22c-61559bee852a" containerID="86d63d89448664d061af9377adb2ecbd115c5a9e9c513ff7ecb7c1633f03b880" exitCode=0 Feb 20 07:16:29 crc kubenswrapper[5094]: I0220 07:16:29.114607 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"86d63d89448664d061af9377adb2ecbd115c5a9e9c513ff7ecb7c1633f03b880"} Feb 20 07:16:30 crc kubenswrapper[5094]: I0220 07:16:30.125123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" 
event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerStarted","Data":"4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301"} Feb 20 07:16:30 crc kubenswrapper[5094]: I0220 07:16:30.148111 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bpgwq" podStartSLOduration=2.77341787 podStartE2EDuration="4.14808542s" podCreationTimestamp="2026-02-20 07:16:26 +0000 UTC" firstStartedPulling="2026-02-20 07:16:28.105929631 +0000 UTC m=+1802.978556342" lastFinishedPulling="2026-02-20 07:16:29.480597161 +0000 UTC m=+1804.353223892" observedRunningTime="2026-02-20 07:16:30.141786368 +0000 UTC m=+1805.014413079" watchObservedRunningTime="2026-02-20 07:16:30.14808542 +0000 UTC m=+1805.020712161" Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.888785 5094 scope.go:117] "RemoveContainer" containerID="ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e" Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.929195 5094 scope.go:117] "RemoveContainer" containerID="20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151" Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.944119 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.944984 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.995019 5094 scope.go:117] "RemoveContainer" containerID="f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01" Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.023191 5094 scope.go:117] "RemoveContainer" containerID="f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2" Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.070282 5094 scope.go:117] "RemoveContainer" 
containerID="f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5" Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.096637 5094 scope.go:117] "RemoveContainer" containerID="3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484" Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.120577 5094 scope.go:117] "RemoveContainer" containerID="8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561" Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.147105 5094 scope.go:117] "RemoveContainer" containerID="1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862" Feb 20 07:16:35 crc kubenswrapper[5094]: I0220 07:16:35.015090 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fxgmt" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server" probeResult="failure" output=< Feb 20 07:16:35 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 07:16:35 crc kubenswrapper[5094]: > Feb 20 07:16:36 crc kubenswrapper[5094]: I0220 07:16:36.930370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:36 crc kubenswrapper[5094]: I0220 07:16:36.930852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:37 crc kubenswrapper[5094]: I0220 07:16:37.002300 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:37 crc kubenswrapper[5094]: I0220 07:16:37.242693 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:40 crc kubenswrapper[5094]: I0220 07:16:40.766741 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:40 crc 
kubenswrapper[5094]: I0220 07:16:40.768613 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bpgwq" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server" containerID="cri-o://4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301" gracePeriod=2 Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.231862 5094 generic.go:334] "Generic (PLEG): container finished" podID="edef768c-5542-4656-a22c-61559bee852a" containerID="4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301" exitCode=0 Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.231976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301"} Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.301017 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.350902 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"edef768c-5542-4656-a22c-61559bee852a\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.351023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"edef768c-5542-4656-a22c-61559bee852a\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.351142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"edef768c-5542-4656-a22c-61559bee852a\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.352731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities" (OuterVolumeSpecName: "utilities") pod "edef768c-5542-4656-a22c-61559bee852a" (UID: "edef768c-5542-4656-a22c-61559bee852a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.361273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl" (OuterVolumeSpecName: "kube-api-access-p2fnl") pod "edef768c-5542-4656-a22c-61559bee852a" (UID: "edef768c-5542-4656-a22c-61559bee852a"). InnerVolumeSpecName "kube-api-access-p2fnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.419503 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edef768c-5542-4656-a22c-61559bee852a" (UID: "edef768c-5542-4656-a22c-61559bee852a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.454048 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") on node \"crc\" DevicePath \"\"" Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.454112 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.454131 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.249768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"7c54b7f653705f50f4a7768ba3bc601f95148cb9ef2c08121132b654c1dacb31"} Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.249887 5094 scope.go:117] "RemoveContainer" containerID="4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301" Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.250215 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.299854 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.314080 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.315994 5094 scope.go:117] "RemoveContainer" containerID="86d63d89448664d061af9377adb2ecbd115c5a9e9c513ff7ecb7c1633f03b880" Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.346526 5094 scope.go:117] "RemoveContainer" containerID="445c378b533d8cda17dde1a3261e7c35527956eb919d6b68028d1be9060fcba6" Feb 20 07:16:43 crc kubenswrapper[5094]: I0220 07:16:43.859051 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edef768c-5542-4656-a22c-61559bee852a" path="/var/lib/kubelet/pods/edef768c-5542-4656-a22c-61559bee852a/volumes" Feb 20 07:16:44 crc kubenswrapper[5094]: I0220 07:16:44.006505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:44 crc kubenswrapper[5094]: I0220 07:16:44.074182 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:44 crc kubenswrapper[5094]: I0220 07:16:44.968849 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.278851 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxgmt" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server" containerID="cri-o://372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521" gracePeriod=2 Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 
07:16:45.865244 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.936418 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"fc330ce9-f173-403a-a659-c3c7326a8ae5\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.936561 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"fc330ce9-f173-403a-a659-c3c7326a8ae5\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.936674 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"fc330ce9-f173-403a-a659-c3c7326a8ae5\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.937653 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities" (OuterVolumeSpecName: "utilities") pod "fc330ce9-f173-403a-a659-c3c7326a8ae5" (UID: "fc330ce9-f173-403a-a659-c3c7326a8ae5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.947334 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q" (OuterVolumeSpecName: "kube-api-access-kfj9q") pod "fc330ce9-f173-403a-a659-c3c7326a8ae5" (UID: "fc330ce9-f173-403a-a659-c3c7326a8ae5"). InnerVolumeSpecName "kube-api-access-kfj9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.039613 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.039668 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") on node \"crc\" DevicePath \"\"" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.125827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc330ce9-f173-403a-a659-c3c7326a8ae5" (UID: "fc330ce9-f173-403a-a659-c3c7326a8ae5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.141403 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293434 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521" exitCode=0 Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293535 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"} Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293545 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293587 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"2aba90f111620ddf577aab25aaa8798b21a91cff9c03d53b756128d812853f94"} Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293628 5094 scope.go:117] "RemoveContainer" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.336138 5094 scope.go:117] "RemoveContainer" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.367565 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.375069 5094 scope.go:117] "RemoveContainer" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.380888 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.410689 5094 scope.go:117] "RemoveContainer" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521" Feb 20 07:16:46 crc kubenswrapper[5094]: E0220 07:16:46.411198 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521\": container with ID starting with 372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521 not found: ID does not exist" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.411382 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"} err="failed to get container status \"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521\": rpc error: code = NotFound desc = could not find container \"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521\": container with ID starting with 372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521 not found: ID does not exist" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.411431 5094 scope.go:117] "RemoveContainer" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e" Feb 20 07:16:46 crc kubenswrapper[5094]: E0220 07:16:46.412022 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e\": container with ID starting with 08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e not found: ID does not exist" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.412055 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"} err="failed to get container status \"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e\": rpc error: code = NotFound desc = could not find container \"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e\": container with ID starting with 08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e not found: ID does not exist" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.412075 5094 scope.go:117] "RemoveContainer" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240" Feb 20 07:16:46 crc kubenswrapper[5094]: E0220 
07:16:46.412358 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240\": container with ID starting with 839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240 not found: ID does not exist" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240" Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.412387 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240"} err="failed to get container status \"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240\": rpc error: code = NotFound desc = could not find container \"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240\": container with ID starting with 839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240 not found: ID does not exist" Feb 20 07:16:47 crc kubenswrapper[5094]: I0220 07:16:47.855610 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" path="/var/lib/kubelet/pods/fc330ce9-f173-403a-a659-c3c7326a8ae5/volumes" Feb 20 07:18:34 crc kubenswrapper[5094]: I0220 07:18:34.107665 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:18:34 crc kubenswrapper[5094]: I0220 07:18:34.108662 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 20 07:19:04 crc kubenswrapper[5094]: I0220 07:19:04.107357 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:19:04 crc kubenswrapper[5094]: I0220 07:19:04.108553 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.572867 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574449 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-content" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574481 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-content" Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574510 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-utilities" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574527 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-utilities" Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574558 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-utilities" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574578 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-utilities" Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574617 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-content" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574634 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-content" Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574663 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server" Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574748 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574766 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.575082 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.575112 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.577012 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.613751 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.662160 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.662257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.662336 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.764360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.764446 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.764510 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.765465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.765915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.790115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.915260 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:28 crc kubenswrapper[5094]: I0220 07:19:28.167234 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:29 crc kubenswrapper[5094]: I0220 07:19:29.016824 5094 generic.go:334] "Generic (PLEG): container finished" podID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerID="919dfcc7f250bd3e3599094bc494141651def23aa2461960fe0fad45431e684c" exitCode=0 Feb 20 07:19:29 crc kubenswrapper[5094]: I0220 07:19:29.017130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"919dfcc7f250bd3e3599094bc494141651def23aa2461960fe0fad45431e684c"} Feb 20 07:19:29 crc kubenswrapper[5094]: I0220 07:19:29.017388 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerStarted","Data":"83c2439b2b22253659c1167a4f6763f69e16ecce665b8c681073b07ed24d0fda"} Feb 20 07:19:30 crc kubenswrapper[5094]: I0220 07:19:30.026392 5094 generic.go:334] "Generic (PLEG): container finished" podID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerID="f2fc254af1f6f42e8ded1fcc5cc5bf7d80c82de1e467dd8a5830d513f713bbd6" exitCode=0 Feb 20 07:19:30 crc kubenswrapper[5094]: I0220 07:19:30.026538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"f2fc254af1f6f42e8ded1fcc5cc5bf7d80c82de1e467dd8a5830d513f713bbd6"} Feb 20 07:19:31 crc kubenswrapper[5094]: I0220 07:19:31.035481 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" 
event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerStarted","Data":"fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647"} Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.107193 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.107832 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.107911 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.109046 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.109189 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df" gracePeriod=600 Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.077665 
5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df" exitCode=0 Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.077720 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df"} Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.078523 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"} Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.078565 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.102995 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-97lzn" podStartSLOduration=6.718236209 podStartE2EDuration="8.102892999s" podCreationTimestamp="2026-02-20 07:19:27 +0000 UTC" firstStartedPulling="2026-02-20 07:19:29.019419173 +0000 UTC m=+1983.892045924" lastFinishedPulling="2026-02-20 07:19:30.404075973 +0000 UTC m=+1985.276702714" observedRunningTime="2026-02-20 07:19:31.061048259 +0000 UTC m=+1985.933674970" watchObservedRunningTime="2026-02-20 07:19:35.102892999 +0000 UTC m=+1989.975519710" Feb 20 07:19:37 crc kubenswrapper[5094]: I0220 07:19:37.916382 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:37 crc kubenswrapper[5094]: I0220 07:19:37.916891 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:37 crc kubenswrapper[5094]: I0220 07:19:37.989251 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:38 crc kubenswrapper[5094]: I0220 07:19:38.178782 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:38 crc kubenswrapper[5094]: I0220 07:19:38.254059 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:40 crc kubenswrapper[5094]: I0220 07:19:40.127907 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-97lzn" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" containerID="cri-o://fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647" gracePeriod=2 Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143064 5094 generic.go:334] "Generic (PLEG): container finished" podID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerID="fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647" exitCode=0 Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647"} Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"83c2439b2b22253659c1167a4f6763f69e16ecce665b8c681073b07ed24d0fda"} Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143544 5094 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="83c2439b2b22253659c1167a4f6763f69e16ecce665b8c681073b07ed24d0fda" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143157 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.329254 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"97983ca9-1559-417f-9d3d-876f4dc9301a\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.329357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"97983ca9-1559-417f-9d3d-876f4dc9301a\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.329404 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"97983ca9-1559-417f-9d3d-876f4dc9301a\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.331024 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities" (OuterVolumeSpecName: "utilities") pod "97983ca9-1559-417f-9d3d-876f4dc9301a" (UID: "97983ca9-1559-417f-9d3d-876f4dc9301a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.342067 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62" (OuterVolumeSpecName: "kube-api-access-4dn62") pod "97983ca9-1559-417f-9d3d-876f4dc9301a" (UID: "97983ca9-1559-417f-9d3d-876f4dc9301a"). InnerVolumeSpecName "kube-api-access-4dn62". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.361906 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97983ca9-1559-417f-9d3d-876f4dc9301a" (UID: "97983ca9-1559-417f-9d3d-876f4dc9301a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.432480 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.433031 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.433226 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") on node \"crc\" DevicePath \"\"" Feb 20 07:19:42 crc kubenswrapper[5094]: I0220 07:19:42.152446 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:42 crc kubenswrapper[5094]: I0220 07:19:42.177592 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:42 crc kubenswrapper[5094]: I0220 07:19:42.193114 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:43 crc kubenswrapper[5094]: I0220 07:19:43.850041 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" path="/var/lib/kubelet/pods/97983ca9-1559-417f-9d3d-876f4dc9301a/volumes" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.462882 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:05 crc kubenswrapper[5094]: E0220 07:21:05.464303 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464327 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" Feb 20 07:21:05 crc kubenswrapper[5094]: E0220 07:21:05.464386 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-content" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464400 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-content" Feb 20 07:21:05 crc kubenswrapper[5094]: E0220 07:21:05.464437 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-utilities" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464453 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-utilities" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464765 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.466785 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.491089 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.557471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.557618 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.557815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.659933 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.660122 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.660158 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.660798 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.661258 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.681778 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.803979 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.093663 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.985366 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" exitCode=0 Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.985489 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d"} Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.985973 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerStarted","Data":"3f86ead75ecd7c62888f8f57bd59f88b3baa96e25bd27fb0c40378046f28994b"} Feb 20 07:21:08 crc kubenswrapper[5094]: I0220 07:21:08.000017 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerStarted","Data":"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa"} Feb 20 07:21:09 crc kubenswrapper[5094]: I0220 07:21:09.027633 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" exitCode=0 Feb 20 07:21:09 crc kubenswrapper[5094]: I0220 07:21:09.027748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa"} Feb 20 07:21:10 crc kubenswrapper[5094]: I0220 07:21:10.038779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerStarted","Data":"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5"} Feb 20 07:21:10 crc kubenswrapper[5094]: I0220 07:21:10.066969 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2nsvk" podStartSLOduration=2.617735178 podStartE2EDuration="5.066939731s" podCreationTimestamp="2026-02-20 07:21:05 +0000 UTC" firstStartedPulling="2026-02-20 07:21:06.989136402 +0000 UTC m=+2081.861763123" lastFinishedPulling="2026-02-20 07:21:09.438340965 +0000 UTC m=+2084.310967676" observedRunningTime="2026-02-20 07:21:10.062797122 +0000 UTC m=+2084.935423853" watchObservedRunningTime="2026-02-20 07:21:10.066939731 +0000 UTC m=+2084.939566452" Feb 20 07:21:15 crc kubenswrapper[5094]: I0220 07:21:15.804921 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:15 crc kubenswrapper[5094]: I0220 07:21:15.805545 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:15 crc kubenswrapper[5094]: I0220 07:21:15.890966 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 
07:21:16 crc kubenswrapper[5094]: I0220 07:21:16.151312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:16 crc kubenswrapper[5094]: I0220 07:21:16.222245 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.115234 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2nsvk" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" containerID="cri-o://9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" gracePeriod=2 Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.563377 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.627788 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.627909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.627955 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " Feb 
20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.629792 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities" (OuterVolumeSpecName: "utilities") pod "5a1b1538-58d2-448e-8a39-9ec2dac98a3e" (UID: "5a1b1538-58d2-448e-8a39-9ec2dac98a3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.637999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52" (OuterVolumeSpecName: "kube-api-access-fdz52") pod "5a1b1538-58d2-448e-8a39-9ec2dac98a3e" (UID: "5a1b1538-58d2-448e-8a39-9ec2dac98a3e"). InnerVolumeSpecName "kube-api-access-fdz52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.728954 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.728990 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") on node \"crc\" DevicePath \"\"" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.849137 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a1b1538-58d2-448e-8a39-9ec2dac98a3e" (UID: "5a1b1538-58d2-448e-8a39-9ec2dac98a3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.932210 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131039 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" exitCode=0 Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5"} Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131198 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"3f86ead75ecd7c62888f8f57bd59f88b3baa96e25bd27fb0c40378046f28994b"} Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131229 5094 scope.go:117] "RemoveContainer" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131457 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.164538 5094 scope.go:117] "RemoveContainer" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.197617 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.230747 5094 scope.go:117] "RemoveContainer" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.233260 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.259979 5094 scope.go:117] "RemoveContainer" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" Feb 20 07:21:19 crc kubenswrapper[5094]: E0220 07:21:19.260810 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5\": container with ID starting with 9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5 not found: ID does not exist" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.260868 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5"} err="failed to get container status \"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5\": rpc error: code = NotFound desc = could not find container \"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5\": container with ID starting with 9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5 not 
found: ID does not exist" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.260909 5094 scope.go:117] "RemoveContainer" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" Feb 20 07:21:19 crc kubenswrapper[5094]: E0220 07:21:19.261640 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa\": container with ID starting with 19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa not found: ID does not exist" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.261697 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa"} err="failed to get container status \"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa\": rpc error: code = NotFound desc = could not find container \"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa\": container with ID starting with 19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa not found: ID does not exist" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.261778 5094 scope.go:117] "RemoveContainer" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" Feb 20 07:21:19 crc kubenswrapper[5094]: E0220 07:21:19.262211 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d\": container with ID starting with cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d not found: ID does not exist" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.262252 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d"} err="failed to get container status \"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d\": rpc error: code = NotFound desc = could not find container \"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d\": container with ID starting with cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d not found: ID does not exist" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.853131 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" path="/var/lib/kubelet/pods/5a1b1538-58d2-448e-8a39-9ec2dac98a3e/volumes" Feb 20 07:21:34 crc kubenswrapper[5094]: I0220 07:21:34.107872 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:21:34 crc kubenswrapper[5094]: I0220 07:21:34.108848 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:22:04 crc kubenswrapper[5094]: I0220 07:22:04.106642 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:22:04 crc kubenswrapper[5094]: I0220 07:22:04.107810 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.106525 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.107424 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.107506 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.108589 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.108699 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" gracePeriod=600 Feb 20 07:22:34 crc kubenswrapper[5094]: E0220 07:22:34.237370 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.946032 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" exitCode=0 Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.946166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"} Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.946783 5094 scope.go:117] "RemoveContainer" containerID="38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.948734 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:22:34 crc kubenswrapper[5094]: E0220 07:22:34.949331 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:22:46 crc kubenswrapper[5094]: I0220 07:22:46.840382 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:22:46 crc kubenswrapper[5094]: E0220 07:22:46.841483 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:00 crc kubenswrapper[5094]: I0220 07:23:00.840993 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:00 crc kubenswrapper[5094]: E0220 07:23:00.841986 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:11 crc kubenswrapper[5094]: I0220 07:23:11.840737 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:11 crc kubenswrapper[5094]: E0220 07:23:11.842892 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:25 crc kubenswrapper[5094]: I0220 07:23:25.849674 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:25 crc kubenswrapper[5094]: E0220 07:23:25.851536 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:36 crc kubenswrapper[5094]: I0220 07:23:36.840938 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:36 crc kubenswrapper[5094]: E0220 07:23:36.844734 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:47 crc kubenswrapper[5094]: I0220 07:23:47.841276 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:47 crc kubenswrapper[5094]: E0220 07:23:47.842539 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:01 crc kubenswrapper[5094]: I0220 07:24:01.840582 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:01 crc kubenswrapper[5094]: E0220 07:24:01.843583 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:14 crc kubenswrapper[5094]: I0220 07:24:14.840424 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:14 crc kubenswrapper[5094]: E0220 07:24:14.841304 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:29 crc kubenswrapper[5094]: I0220 07:24:29.840858 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:29 crc kubenswrapper[5094]: E0220 07:24:29.841844 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:41 crc kubenswrapper[5094]: I0220 07:24:41.840551 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:41 crc kubenswrapper[5094]: E0220 07:24:41.841957 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:56 crc kubenswrapper[5094]: I0220 07:24:56.841506 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:56 crc kubenswrapper[5094]: E0220 07:24:56.842830 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:08 crc kubenswrapper[5094]: I0220 07:25:08.841595 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:08 crc kubenswrapper[5094]: E0220 07:25:08.842603 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:21 crc kubenswrapper[5094]: I0220 07:25:21.841584 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:21 crc kubenswrapper[5094]: E0220 07:25:21.842987 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.546767 5094 scope.go:117] "RemoveContainer" containerID="919dfcc7f250bd3e3599094bc494141651def23aa2461960fe0fad45431e684c" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.599518 5094 scope.go:117] "RemoveContainer" containerID="f2fc254af1f6f42e8ded1fcc5cc5bf7d80c82de1e467dd8a5830d513f713bbd6" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.643216 5094 scope.go:117] "RemoveContainer" containerID="fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.841070 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:34 crc kubenswrapper[5094]: E0220 07:25:34.842209 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:46 crc kubenswrapper[5094]: I0220 07:25:46.841019 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:46 crc kubenswrapper[5094]: E0220 07:25:46.842143 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:57 crc kubenswrapper[5094]: I0220 07:25:57.843447 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:57 crc kubenswrapper[5094]: E0220 07:25:57.845294 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:10 crc kubenswrapper[5094]: I0220 07:26:10.841010 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:10 crc kubenswrapper[5094]: E0220 07:26:10.842221 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:22 crc kubenswrapper[5094]: I0220 07:26:22.841896 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:22 crc kubenswrapper[5094]: E0220 07:26:22.845764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.694288 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:32 crc kubenswrapper[5094]: E0220 07:26:32.695374 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-utilities" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695394 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-utilities" Feb 20 07:26:32 crc kubenswrapper[5094]: E0220 07:26:32.695437 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-content" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695452 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-content" Feb 20 07:26:32 crc kubenswrapper[5094]: E0220 07:26:32.695472 5094 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695481 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695670 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.697160 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715299 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715938 5094 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.817615 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.817735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.817803 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.818242 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.818599 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " 
pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.845998 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:33 crc kubenswrapper[5094]: I0220 07:26:33.020884 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:33 crc kubenswrapper[5094]: I0220 07:26:33.478968 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:33 crc kubenswrapper[5094]: I0220 07:26:33.841551 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:33 crc kubenswrapper[5094]: E0220 07:26:33.842359 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.455309 5094 generic.go:334] "Generic (PLEG): container finished" podID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9" exitCode=0 Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.455387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" 
event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9"} Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.455462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerStarted","Data":"9d79db86dab6a517e10d93843c540d48efb57163ba63e6df9bf20cab17a7726a"} Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.458092 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:26:35 crc kubenswrapper[5094]: I0220 07:26:35.468968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerStarted","Data":"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"} Feb 20 07:26:36 crc kubenswrapper[5094]: I0220 07:26:36.481571 5094 generic.go:334] "Generic (PLEG): container finished" podID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01" exitCode=0 Feb 20 07:26:36 crc kubenswrapper[5094]: I0220 07:26:36.481651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"} Feb 20 07:26:37 crc kubenswrapper[5094]: I0220 07:26:37.495024 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerStarted","Data":"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"} Feb 20 07:26:37 crc kubenswrapper[5094]: I0220 07:26:37.533652 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-h7xgp" podStartSLOduration=3.11470225 podStartE2EDuration="5.533625635s" podCreationTimestamp="2026-02-20 07:26:32 +0000 UTC" firstStartedPulling="2026-02-20 07:26:34.457813416 +0000 UTC m=+2409.330440127" lastFinishedPulling="2026-02-20 07:26:36.876736801 +0000 UTC m=+2411.749363512" observedRunningTime="2026-02-20 07:26:37.531101975 +0000 UTC m=+2412.403728706" watchObservedRunningTime="2026-02-20 07:26:37.533625635 +0000 UTC m=+2412.406252376" Feb 20 07:26:43 crc kubenswrapper[5094]: I0220 07:26:43.021558 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:43 crc kubenswrapper[5094]: I0220 07:26:43.022077 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:44 crc kubenswrapper[5094]: I0220 07:26:44.086222 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h7xgp" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server" probeResult="failure" output=< Feb 20 07:26:44 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 07:26:44 crc kubenswrapper[5094]: > Feb 20 07:26:45 crc kubenswrapper[5094]: I0220 07:26:45.850633 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:45 crc kubenswrapper[5094]: E0220 07:26:45.850958 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:53 crc kubenswrapper[5094]: 
I0220 07:26:53.107889 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:53 crc kubenswrapper[5094]: I0220 07:26:53.167758 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:53 crc kubenswrapper[5094]: I0220 07:26:53.359744 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:54 crc kubenswrapper[5094]: I0220 07:26:54.670367 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h7xgp" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server" containerID="cri-o://bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" gracePeriod=2 Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.130297 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.170639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.170822 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.170936 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.173298 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities" (OuterVolumeSpecName: "utilities") pod "8d9d8b9a-3617-4295-bea3-3339d6dc7b88" (UID: "8d9d8b9a-3617-4295-bea3-3339d6dc7b88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.192043 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc" (OuterVolumeSpecName: "kube-api-access-b2gsc") pod "8d9d8b9a-3617-4295-bea3-3339d6dc7b88" (UID: "8d9d8b9a-3617-4295-bea3-3339d6dc7b88"). InnerVolumeSpecName "kube-api-access-b2gsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.272233 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.272272 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") on node \"crc\" DevicePath \"\"" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.415589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d9d8b9a-3617-4295-bea3-3339d6dc7b88" (UID: "8d9d8b9a-3617-4295-bea3-3339d6dc7b88"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.476651 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679300 5094 generic.go:334] "Generic (PLEG): container finished" podID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" exitCode=0 Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679348 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"} Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"9d79db86dab6a517e10d93843c540d48efb57163ba63e6df9bf20cab17a7726a"} Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679403 5094 scope.go:117] "RemoveContainer" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679438 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.702462 5094 scope.go:117] "RemoveContainer" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.719506 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"]
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.724397 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"]
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.729494 5094 scope.go:117] "RemoveContainer" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.757790 5094 scope.go:117] "RemoveContainer" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"
Feb 20 07:26:55 crc kubenswrapper[5094]: E0220 07:26:55.758266 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24\": container with ID starting with bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24 not found: ID does not exist" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.758346 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"} err="failed to get container status \"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24\": rpc error: code = NotFound desc = could not find container \"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24\": container with ID starting with bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24 not found: ID does not exist"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.758392 5094 scope.go:117] "RemoveContainer" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"
Feb 20 07:26:55 crc kubenswrapper[5094]: E0220 07:26:55.759090 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01\": container with ID starting with aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01 not found: ID does not exist" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.759156 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"} err="failed to get container status \"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01\": rpc error: code = NotFound desc = could not find container \"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01\": container with ID starting with aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01 not found: ID does not exist"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.759197 5094 scope.go:117] "RemoveContainer" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9"
Feb 20 07:26:55 crc kubenswrapper[5094]: E0220 07:26:55.759593 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9\": container with ID starting with 2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9 not found: ID does not exist" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.759640 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9"} err="failed to get container status \"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9\": rpc error: code = NotFound desc = could not find container \"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9\": container with ID starting with 2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9 not found: ID does not exist"
Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.851811 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" path="/var/lib/kubelet/pods/8d9d8b9a-3617-4295-bea3-3339d6dc7b88/volumes"
Feb 20 07:26:58 crc kubenswrapper[5094]: I0220 07:26:58.841411 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"
Feb 20 07:26:58 crc kubenswrapper[5094]: E0220 07:26:58.842560 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:27:12 crc kubenswrapper[5094]: I0220 07:27:12.841134 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"
Feb 20 07:27:12 crc kubenswrapper[5094]: E0220 07:27:12.842461 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:27:23 crc kubenswrapper[5094]: I0220 07:27:23.840991 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"
Feb 20 07:27:23 crc kubenswrapper[5094]: E0220 07:27:23.842583 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.261660 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p5hgb"]
Feb 20 07:27:27 crc kubenswrapper[5094]: E0220 07:27:27.263215 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-utilities"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.263246 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-utilities"
Feb 20 07:27:27 crc kubenswrapper[5094]: E0220 07:27:27.263283 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.263293 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server"
Feb 20 07:27:27 crc kubenswrapper[5094]: E0220 07:27:27.263317 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-content"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.263327 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-content"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.263553 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.265395 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.293054 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"]
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.466219 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.466294 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.466378 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.567949 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.568007 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.568079 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.568728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.569033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.600540 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.887033 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:28 crc kubenswrapper[5094]: I0220 07:27:28.191848 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"]
Feb 20 07:27:29 crc kubenswrapper[5094]: I0220 07:27:29.094965 5094 generic.go:334] "Generic (PLEG): container finished" podID="9110846b-830e-4e8f-a471-34dea42873ff" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd" exitCode=0
Feb 20 07:27:29 crc kubenswrapper[5094]: I0220 07:27:29.095431 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd"}
Feb 20 07:27:29 crc kubenswrapper[5094]: I0220 07:27:29.095478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerStarted","Data":"3888aa249479a0fe72c5da8b03cb1d89510858b5eefdfb659b65a7a6857906df"}
Feb 20 07:27:30 crc kubenswrapper[5094]: I0220 07:27:30.108513 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerStarted","Data":"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"}
Feb 20 07:27:31 crc kubenswrapper[5094]: I0220 07:27:31.123801 5094 generic.go:334] "Generic (PLEG): container finished" podID="9110846b-830e-4e8f-a471-34dea42873ff" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278" exitCode=0
Feb 20 07:27:31 crc kubenswrapper[5094]: I0220 07:27:31.123886 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"}
Feb 20 07:27:32 crc kubenswrapper[5094]: I0220 07:27:32.140169 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerStarted","Data":"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"}
Feb 20 07:27:32 crc kubenswrapper[5094]: I0220 07:27:32.175042 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p5hgb" podStartSLOduration=2.735628534 podStartE2EDuration="5.17501731s" podCreationTimestamp="2026-02-20 07:27:27 +0000 UTC" firstStartedPulling="2026-02-20 07:27:29.099450285 +0000 UTC m=+2463.972077046" lastFinishedPulling="2026-02-20 07:27:31.538839091 +0000 UTC m=+2466.411465822" observedRunningTime="2026-02-20 07:27:32.168915913 +0000 UTC m=+2467.041542644" watchObservedRunningTime="2026-02-20 07:27:32.17501731 +0000 UTC m=+2467.047644031"
Feb 20 07:27:35 crc kubenswrapper[5094]: I0220 07:27:35.848602 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"
Feb 20 07:27:36 crc kubenswrapper[5094]: I0220 07:27:36.186854 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc"}
Feb 20 07:27:37 crc kubenswrapper[5094]: I0220 07:27:37.888234 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:37 crc kubenswrapper[5094]: I0220 07:27:37.889427 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:37 crc kubenswrapper[5094]: I0220 07:27:37.982609 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:38 crc kubenswrapper[5094]: I0220 07:27:38.274288 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:38 crc kubenswrapper[5094]: I0220 07:27:38.386253 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"]
Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.228382 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p5hgb" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server" containerID="cri-o://4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3" gracePeriod=2
Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.836278 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.930422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"9110846b-830e-4e8f-a471-34dea42873ff\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") "
Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.930532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"9110846b-830e-4e8f-a471-34dea42873ff\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") "
Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.930763 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"9110846b-830e-4e8f-a471-34dea42873ff\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") "
Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.932449 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities" (OuterVolumeSpecName: "utilities") pod "9110846b-830e-4e8f-a471-34dea42873ff" (UID: "9110846b-830e-4e8f-a471-34dea42873ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.940954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx" (OuterVolumeSpecName: "kube-api-access-szkzx") pod "9110846b-830e-4e8f-a471-34dea42873ff" (UID: "9110846b-830e-4e8f-a471-34dea42873ff"). InnerVolumeSpecName "kube-api-access-szkzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.017899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9110846b-830e-4e8f-a471-34dea42873ff" (UID: "9110846b-830e-4e8f-a471-34dea42873ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.033908 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.033981 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") on node \"crc\" DevicePath \"\""
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.034002 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.241796 5094 generic.go:334] "Generic (PLEG): container finished" podID="9110846b-830e-4e8f-a471-34dea42873ff" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3" exitCode=0
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.241874 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"}
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.242427 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"3888aa249479a0fe72c5da8b03cb1d89510858b5eefdfb659b65a7a6857906df"}
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.242473 5094 scope.go:117] "RemoveContainer" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.242055 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.296208 5094 scope.go:117] "RemoveContainer" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.310772 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"]
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.314924 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"]
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.331183 5094 scope.go:117] "RemoveContainer" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.369228 5094 scope.go:117] "RemoveContainer" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"
Feb 20 07:27:41 crc kubenswrapper[5094]: E0220 07:27:41.369775 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3\": container with ID starting with 4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3 not found: ID does not exist" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.369835 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"} err="failed to get container status \"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3\": rpc error: code = NotFound desc = could not find container \"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3\": container with ID starting with 4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3 not found: ID does not exist"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.369876 5094 scope.go:117] "RemoveContainer" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"
Feb 20 07:27:41 crc kubenswrapper[5094]: E0220 07:27:41.370546 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278\": container with ID starting with 01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278 not found: ID does not exist" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.370609 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"} err="failed to get container status \"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278\": rpc error: code = NotFound desc = could not find container \"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278\": container with ID starting with 01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278 not found: ID does not exist"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.370634 5094 scope.go:117] "RemoveContainer" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd"
Feb 20 07:27:41 crc kubenswrapper[5094]: E0220 07:27:41.371574 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd\": container with ID starting with ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd not found: ID does not exist" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.371615 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd"} err="failed to get container status \"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd\": rpc error: code = NotFound desc = could not find container \"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd\": container with ID starting with ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd not found: ID does not exist"
Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.857818 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9110846b-830e-4e8f-a471-34dea42873ff" path="/var/lib/kubelet/pods/9110846b-830e-4e8f-a471-34dea42873ff/volumes"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.175081 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"]
Feb 20 07:30:00 crc kubenswrapper[5094]: E0220 07:30:00.176456 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.176480 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server"
Feb 20 07:30:00 crc kubenswrapper[5094]: E0220 07:30:00.176523 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-content"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.176538 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-content"
Feb 20 07:30:00 crc kubenswrapper[5094]: E0220 07:30:00.176565 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-utilities"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.176578 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-utilities"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.177313 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.178914 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.183803 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.186275 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.211355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"]
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.212816 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.212909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.213153 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.315241 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.315332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.315461 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.317282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.331091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.339967 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.525623 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.045050 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"]
Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.748808 5094 generic.go:334] "Generic (PLEG): container finished" podID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerID="07ea8e807e5436859467c750ef51269844eba966788ab09e687b71868fdd8b31" exitCode=0
Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.748934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" event={"ID":"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d","Type":"ContainerDied","Data":"07ea8e807e5436859467c750ef51269844eba966788ab09e687b71868fdd8b31"}
Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.749203 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" event={"ID":"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d","Type":"ContainerStarted","Data":"75598ee0137b474f71854526287c35843ae94a087c015ae47398f1b6cd665787"}
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.149319 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.281836 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") "
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.281928 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") "
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.282087 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") "
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.283529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume" (OuterVolumeSpecName: "config-volume") pod "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" (UID: "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.290496 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw" (OuterVolumeSpecName: "kube-api-access-mg8sw") pod "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" (UID: "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d"). InnerVolumeSpecName "kube-api-access-mg8sw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.291972 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" (UID: "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.385041 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") on node \"crc\" DevicePath \"\""
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.385124 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.385146 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.780104 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" event={"ID":"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d","Type":"ContainerDied","Data":"75598ee0137b474f71854526287c35843ae94a087c015ae47398f1b6cd665787"}
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.780183 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75598ee0137b474f71854526287c35843ae94a087c015ae47398f1b6cd665787"
Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.780259 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.106655 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.107898 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.248005 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.256931 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 07:30:05 crc kubenswrapper[5094]: I0220 07:30:05.859080 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" path="/var/lib/kubelet/pods/806ba791-714c-4d13-b595-d4f6ccf06aea/volumes" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.479921 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:23 crc kubenswrapper[5094]: E0220 07:30:23.481401 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerName="collect-profiles" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.481435 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerName="collect-profiles" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.481763 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerName="collect-profiles" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.487334 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.505838 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.593464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.593544 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.593571 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.696461 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.696557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.696776 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.698028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.698047 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.724917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.810328 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:24 crc kubenswrapper[5094]: I0220 07:30:24.164863 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:25 crc kubenswrapper[5094]: I0220 07:30:25.017898 5094 generic.go:334] "Generic (PLEG): container finished" podID="3427dbbc-6f50-4576-9871-52bd5a127484" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" exitCode=0 Feb 20 07:30:25 crc kubenswrapper[5094]: I0220 07:30:25.018273 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855"} Feb 20 07:30:25 crc kubenswrapper[5094]: I0220 07:30:25.018408 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerStarted","Data":"64477b0fb18d73d2b6d5e2bce6c185ea99cd22a3e25012ecc5859d23259c3a85"} Feb 20 07:30:26 crc kubenswrapper[5094]: I0220 07:30:26.026439 5094 generic.go:334] "Generic (PLEG): container finished" podID="3427dbbc-6f50-4576-9871-52bd5a127484" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" exitCode=0 Feb 20 07:30:26 crc kubenswrapper[5094]: I0220 07:30:26.026588 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" 
event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295"} Feb 20 07:30:27 crc kubenswrapper[5094]: I0220 07:30:27.041215 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerStarted","Data":"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42"} Feb 20 07:30:27 crc kubenswrapper[5094]: I0220 07:30:27.071910 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zb9lv" podStartSLOduration=2.532102259 podStartE2EDuration="4.071883548s" podCreationTimestamp="2026-02-20 07:30:23 +0000 UTC" firstStartedPulling="2026-02-20 07:30:25.020881948 +0000 UTC m=+2639.893508699" lastFinishedPulling="2026-02-20 07:30:26.560663267 +0000 UTC m=+2641.433289988" observedRunningTime="2026-02-20 07:30:27.069098681 +0000 UTC m=+2641.941725392" watchObservedRunningTime="2026-02-20 07:30:27.071883548 +0000 UTC m=+2641.944510269" Feb 20 07:30:33 crc kubenswrapper[5094]: I0220 07:30:33.811263 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:33 crc kubenswrapper[5094]: I0220 07:30:33.812231 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:33 crc kubenswrapper[5094]: I0220 07:30:33.882652 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.107221 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.107548 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.146502 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.216096 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.842275 5094 scope.go:117] "RemoveContainer" containerID="97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.111045 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zb9lv" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" containerID="cri-o://5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" gracePeriod=2 Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.588617 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.724734 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"3427dbbc-6f50-4576-9871-52bd5a127484\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.724885 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"3427dbbc-6f50-4576-9871-52bd5a127484\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.724927 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"3427dbbc-6f50-4576-9871-52bd5a127484\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.725859 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities" (OuterVolumeSpecName: "utilities") pod "3427dbbc-6f50-4576-9871-52bd5a127484" (UID: "3427dbbc-6f50-4576-9871-52bd5a127484"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.732436 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44" (OuterVolumeSpecName: "kube-api-access-qxl44") pod "3427dbbc-6f50-4576-9871-52bd5a127484" (UID: "3427dbbc-6f50-4576-9871-52bd5a127484"). InnerVolumeSpecName "kube-api-access-qxl44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.748189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3427dbbc-6f50-4576-9871-52bd5a127484" (UID: "3427dbbc-6f50-4576-9871-52bd5a127484"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.826484 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.827664 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.827688 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121275 5094 generic.go:334] "Generic (PLEG): container finished" podID="3427dbbc-6f50-4576-9871-52bd5a127484" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" exitCode=0 Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42"} Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121384 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121889 5094 scope.go:117] "RemoveContainer" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.135867 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"64477b0fb18d73d2b6d5e2bce6c185ea99cd22a3e25012ecc5859d23259c3a85"} Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.166387 5094 scope.go:117] "RemoveContainer" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.168866 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.178144 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.192226 5094 scope.go:117] "RemoveContainer" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.221467 5094 scope.go:117] "RemoveContainer" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" Feb 20 07:30:37 crc kubenswrapper[5094]: E0220 07:30:37.221993 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42\": container with ID starting with 5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42 not found: ID does not exist" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222028 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42"} err="failed to get container status \"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42\": rpc error: code = NotFound desc = could not find container \"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42\": container with ID starting with 5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42 not found: ID does not exist" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222052 5094 scope.go:117] "RemoveContainer" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" Feb 20 07:30:37 crc kubenswrapper[5094]: E0220 07:30:37.222406 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295\": container with ID starting with 6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295 not found: ID does not exist" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222479 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295"} err="failed to get container status \"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295\": rpc error: code = NotFound desc = could not find container \"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295\": container with ID starting with 6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295 not found: ID does not exist" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222520 5094 scope.go:117] "RemoveContainer" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" Feb 20 07:30:37 crc kubenswrapper[5094]: E0220 
07:30:37.222966 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855\": container with ID starting with 856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855 not found: ID does not exist" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.223023 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855"} err="failed to get container status \"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855\": rpc error: code = NotFound desc = could not find container \"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855\": container with ID starting with 856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855 not found: ID does not exist" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.855910 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" path="/var/lib/kubelet/pods/3427dbbc-6f50-4576-9871-52bd5a127484/volumes" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.107059 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.107871 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.108055 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.108940 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.109006 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc" gracePeriod=600 Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.405923 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc" exitCode=0 Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.405978 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc"} Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.406027 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:31:05 crc kubenswrapper[5094]: I0220 07:31:05.429413 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"} Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.884006 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:01 crc kubenswrapper[5094]: E0220 07:32:01.885246 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-content" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885263 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-content" Feb 20 07:32:01 crc kubenswrapper[5094]: E0220 07:32:01.885295 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885302 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" Feb 20 07:32:01 crc kubenswrapper[5094]: E0220 07:32:01.885335 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-utilities" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885343 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-utilities" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885538 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.886632 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.907961 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.941636 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.941773 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.941907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.043336 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.043830 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.043892 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.044678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.044929 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.070081 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.212200 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.751873 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.997233 5094 generic.go:334] "Generic (PLEG): container finished" podID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" exitCode=0 Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.997301 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e"} Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.997346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerStarted","Data":"0f699d5d028fd0e043ec3e175912b3dd63456e67297d084d0a7054349caab64f"} Feb 20 07:32:03 crc kubenswrapper[5094]: I0220 07:32:03.000114 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:32:04 crc kubenswrapper[5094]: I0220 07:32:04.010327 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerStarted","Data":"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399"} Feb 20 07:32:05 crc kubenswrapper[5094]: I0220 07:32:05.021023 5094 generic.go:334] "Generic (PLEG): container finished" podID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" exitCode=0 Feb 20 07:32:05 crc kubenswrapper[5094]: I0220 07:32:05.021124 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399"} Feb 20 07:32:06 crc kubenswrapper[5094]: I0220 07:32:06.048952 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerStarted","Data":"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a"} Feb 20 07:32:06 crc kubenswrapper[5094]: I0220 07:32:06.088649 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rjxrj" podStartSLOduration=2.679409058 podStartE2EDuration="5.088610051s" podCreationTimestamp="2026-02-20 07:32:01 +0000 UTC" firstStartedPulling="2026-02-20 07:32:02.999644631 +0000 UTC m=+2737.872271342" lastFinishedPulling="2026-02-20 07:32:05.408845594 +0000 UTC m=+2740.281472335" observedRunningTime="2026-02-20 07:32:06.077540606 +0000 UTC m=+2740.950167347" watchObservedRunningTime="2026-02-20 07:32:06.088610051 +0000 UTC m=+2740.961236792" Feb 20 07:32:12 crc kubenswrapper[5094]: I0220 07:32:12.213431 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:12 crc kubenswrapper[5094]: I0220 07:32:12.215742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:12 crc kubenswrapper[5094]: I0220 07:32:12.290514 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:13 crc kubenswrapper[5094]: I0220 07:32:13.215853 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:13 crc kubenswrapper[5094]: I0220 
07:32:13.282404 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.146791 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rjxrj" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server" containerID="cri-o://d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" gracePeriod=2 Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.736401 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.915625 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"d651298e-32ea-456a-ac1e-45aae2ab365f\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.915780 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"d651298e-32ea-456a-ac1e-45aae2ab365f\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.915942 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"d651298e-32ea-456a-ac1e-45aae2ab365f\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.917249 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities" (OuterVolumeSpecName: 
"utilities") pod "d651298e-32ea-456a-ac1e-45aae2ab365f" (UID: "d651298e-32ea-456a-ac1e-45aae2ab365f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.924845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp" (OuterVolumeSpecName: "kube-api-access-ktbrp") pod "d651298e-32ea-456a-ac1e-45aae2ab365f" (UID: "d651298e-32ea-456a-ac1e-45aae2ab365f"). InnerVolumeSpecName "kube-api-access-ktbrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.968529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d651298e-32ea-456a-ac1e-45aae2ab365f" (UID: "d651298e-32ea-456a-ac1e-45aae2ab365f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.018187 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") on node \"crc\" DevicePath \"\"" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.018725 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.018745 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165267 5094 generic.go:334] "Generic (PLEG): container finished" podID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" exitCode=0 Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a"} Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165398 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"0f699d5d028fd0e043ec3e175912b3dd63456e67297d084d0a7054349caab64f"} Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165453 5094 scope.go:117] "RemoveContainer" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 
07:32:16.165672 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.206408 5094 scope.go:117] "RemoveContainer" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.225995 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.241932 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.251811 5094 scope.go:117] "RemoveContainer" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.291015 5094 scope.go:117] "RemoveContainer" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" Feb 20 07:32:16 crc kubenswrapper[5094]: E0220 07:32:16.292185 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a\": container with ID starting with d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a not found: ID does not exist" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.292259 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a"} err="failed to get container status \"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a\": rpc error: code = NotFound desc = could not find container \"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a\": container with ID starting with 
d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a not found: ID does not exist" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.292311 5094 scope.go:117] "RemoveContainer" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" Feb 20 07:32:16 crc kubenswrapper[5094]: E0220 07:32:16.293054 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399\": container with ID starting with ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399 not found: ID does not exist" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.293087 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399"} err="failed to get container status \"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399\": rpc error: code = NotFound desc = could not find container \"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399\": container with ID starting with ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399 not found: ID does not exist" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.293108 5094 scope.go:117] "RemoveContainer" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" Feb 20 07:32:16 crc kubenswrapper[5094]: E0220 07:32:16.293728 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e\": container with ID starting with 5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e not found: ID does not exist" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" Feb 20 07:32:16 crc 
kubenswrapper[5094]: I0220 07:32:16.293812 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e"} err="failed to get container status \"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e\": rpc error: code = NotFound desc = could not find container \"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e\": container with ID starting with 5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e not found: ID does not exist" Feb 20 07:32:17 crc kubenswrapper[5094]: I0220 07:32:17.853348 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" path="/var/lib/kubelet/pods/d651298e-32ea-456a-ac1e-45aae2ab365f/volumes" Feb 20 07:33:04 crc kubenswrapper[5094]: I0220 07:33:04.107274 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:33:04 crc kubenswrapper[5094]: I0220 07:33:04.108363 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:33:34 crc kubenswrapper[5094]: I0220 07:33:34.107074 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:33:34 crc kubenswrapper[5094]: I0220 07:33:34.108037 5094 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.107025 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.108105 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.108191 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.109508 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.109620 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
containerName="machine-config-daemon" containerID="cri-o://66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" gracePeriod=600 Feb 20 07:34:04 crc kubenswrapper[5094]: E0220 07:34:04.246214 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.299389 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" exitCode=0 Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.299512 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"} Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.299652 5094 scope.go:117] "RemoveContainer" containerID="d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.301825 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:04 crc kubenswrapper[5094]: E0220 07:34:04.302459 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:18 crc kubenswrapper[5094]: I0220 07:34:18.840991 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:18 crc kubenswrapper[5094]: E0220 07:34:18.842492 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:33 crc kubenswrapper[5094]: I0220 07:34:33.841234 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:33 crc kubenswrapper[5094]: E0220 07:34:33.842612 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:45 crc kubenswrapper[5094]: I0220 07:34:45.853885 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:45 crc kubenswrapper[5094]: E0220 07:34:45.855065 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:59 crc kubenswrapper[5094]: I0220 07:34:59.840634 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:59 crc kubenswrapper[5094]: E0220 07:34:59.841580 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:13 crc kubenswrapper[5094]: I0220 07:35:13.840659 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:13 crc kubenswrapper[5094]: E0220 07:35:13.841974 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:27 crc kubenswrapper[5094]: I0220 07:35:27.843293 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:27 crc kubenswrapper[5094]: E0220 07:35:27.844120 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:39 crc kubenswrapper[5094]: I0220 07:35:39.840842 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:39 crc kubenswrapper[5094]: E0220 07:35:39.841715 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:50 crc kubenswrapper[5094]: I0220 07:35:50.841157 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:50 crc kubenswrapper[5094]: E0220 07:35:50.842637 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:04 crc kubenswrapper[5094]: I0220 07:36:04.839965 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:04 crc kubenswrapper[5094]: E0220 07:36:04.841191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:18 crc kubenswrapper[5094]: I0220 07:36:18.841443 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:18 crc kubenswrapper[5094]: E0220 07:36:18.842674 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:30 crc kubenswrapper[5094]: I0220 07:36:30.845446 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:30 crc kubenswrapper[5094]: E0220 07:36:30.846971 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:45 crc kubenswrapper[5094]: I0220 07:36:45.844382 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:45 crc kubenswrapper[5094]: E0220 07:36:45.845294 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:56 crc kubenswrapper[5094]: I0220 07:36:56.841946 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:56 crc kubenswrapper[5094]: E0220 07:36:56.843371 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:37:08 crc kubenswrapper[5094]: I0220 07:37:08.841177 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:37:08 crc kubenswrapper[5094]: E0220 07:37:08.842460 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:37:20 crc kubenswrapper[5094]: I0220 07:37:20.841430 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:37:20 crc kubenswrapper[5094]: E0220 07:37:20.842669 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:37:33 crc kubenswrapper[5094]: I0220 07:37:33.842324 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:37:33 crc kubenswrapper[5094]: E0220 07:37:33.845342 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.135309 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:37:46 crc kubenswrapper[5094]: E0220 07:37:46.137574 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-utilities"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.137640 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-utilities"
Feb 20 07:37:46 crc kubenswrapper[5094]: E0220 07:37:46.137682 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-content"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.137695 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-content"
Feb 20 07:37:46 crc kubenswrapper[5094]: E0220 07:37:46.137734 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.137744 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.138349 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.142795 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156483 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156682 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156677 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.258804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.258934 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.258991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.259437 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.259518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.283803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.484523 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.956101 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.578027 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367" exitCode=0
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.578090 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"}
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.578132 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerStarted","Data":"89c9e252d537af174af4376fa5796e176641ea798d855baa2c860628688dd9e3"}
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.581611 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.840829 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:37:47 crc kubenswrapper[5094]: E0220 07:37:47.841071 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:37:48 crc kubenswrapper[5094]: I0220 07:37:48.596817 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerStarted","Data":"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"}
Feb 20 07:37:49 crc kubenswrapper[5094]: I0220 07:37:49.611234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"}
Feb 20 07:37:49 crc kubenswrapper[5094]: I0220 07:37:49.612799 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0" exitCode=0
Feb 20 07:37:50 crc kubenswrapper[5094]: I0220 07:37:50.625929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerStarted","Data":"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"}
Feb 20 07:37:50 crc kubenswrapper[5094]: I0220 07:37:50.654237 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z4qm5" podStartSLOduration=2.000894997 podStartE2EDuration="4.654214586s" podCreationTimestamp="2026-02-20 07:37:46 +0000 UTC" firstStartedPulling="2026-02-20 07:37:47.581173628 +0000 UTC m=+3082.453800369" lastFinishedPulling="2026-02-20 07:37:50.234493217 +0000 UTC m=+3085.107119958" observedRunningTime="2026-02-20 07:37:50.651160533 +0000 UTC m=+3085.523787254" watchObservedRunningTime="2026-02-20 07:37:50.654214586 +0000 UTC m=+3085.526841297"
Feb 20 07:37:54 crc kubenswrapper[5094]: I0220 07:37:54.860890 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:37:54 crc kubenswrapper[5094]: I0220 07:37:54.864336 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:54 crc kubenswrapper[5094]: I0220 07:37:54.877564 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.030212 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.030358 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.030422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.132434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.132541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.132590 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.133114 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.133332 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.153016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.208134 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.717482 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.485333 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.485894 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.697851 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529" exitCode=0
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.697942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"}
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.698015 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerStarted","Data":"db1d2b274b0420325bf7c8dcf9ca8777991c781f0ee71c959f5ecabbef484364"}
Feb 20 07:37:57 crc kubenswrapper[5094]: I0220 07:37:57.557754 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z4qm5" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" probeResult="failure" output=<
Feb 20 07:37:57 crc kubenswrapper[5094]: 	timeout: failed to connect service ":50051" within 1s
Feb 20 07:37:57 crc kubenswrapper[5094]: >
Feb 20 07:37:57 crc kubenswrapper[5094]: I0220 07:37:57.713551 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerStarted","Data":"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"}
Feb 20 07:37:58 crc kubenswrapper[5094]: I0220 07:37:58.727116 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8" exitCode=0
Feb 20 07:37:58 crc kubenswrapper[5094]: I0220 07:37:58.727192 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"}
Feb 20 07:37:59 crc kubenswrapper[5094]: I0220 07:37:59.738973 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerStarted","Data":"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"}
Feb 20 07:37:59 crc kubenswrapper[5094]: I0220 07:37:59.765902 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85zvz" podStartSLOduration=3.304419015 podStartE2EDuration="5.765851761s" podCreationTimestamp="2026-02-20 07:37:54 +0000 UTC" firstStartedPulling="2026-02-20 07:37:56.701079751 +0000 UTC m=+3091.573706472" lastFinishedPulling="2026-02-20 07:37:59.162512497 +0000 UTC m=+3094.035139218" observedRunningTime="2026-02-20 07:37:59.759784395 +0000 UTC m=+3094.632411156" watchObservedRunningTime="2026-02-20 07:37:59.765851761 +0000 UTC m=+3094.638478482"
Feb 20 07:38:00 crc kubenswrapper[5094]: I0220 07:38:00.841334 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:38:00 crc kubenswrapper[5094]: E0220 07:38:00.841929 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.209973 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.210986 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.424950 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.914404 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:06 crc kubenswrapper[5094]: I0220 07:38:06.563506 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:06 crc kubenswrapper[5094]: I0220 07:38:06.653225 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:08 crc kubenswrapper[5094]: I0220 07:38:08.873621 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:38:08 crc kubenswrapper[5094]: I0220 07:38:08.874589 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85zvz" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" containerID="cri-o://6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12" gracePeriod=2
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.347232 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.419466 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") "
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.419603 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") "
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.419893 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") "
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.421110 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities" (OuterVolumeSpecName: "utilities") pod "aa5d2e9e-7d8a-48bf-a074-eb34159551ed" (UID: "aa5d2e9e-7d8a-48bf-a074-eb34159551ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.431331 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv" (OuterVolumeSpecName: "kube-api-access-v9gwv") pod "aa5d2e9e-7d8a-48bf-a074-eb34159551ed" (UID: "aa5d2e9e-7d8a-48bf-a074-eb34159551ed"). InnerVolumeSpecName "kube-api-access-v9gwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.500995 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa5d2e9e-7d8a-48bf-a074-eb34159551ed" (UID: "aa5d2e9e-7d8a-48bf-a074-eb34159551ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.522202 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.522243 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.522260 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891624 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12" exitCode=0
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891774 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"}
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891845 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"db1d2b274b0420325bf7c8dcf9ca8777991c781f0ee71c959f5ecabbef484364"}
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891889 5094 scope.go:117] "RemoveContainer" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.893011 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.937673 5094 scope.go:117] "RemoveContainer" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.939012 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.945424 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.966825 5094 scope.go:117] "RemoveContainer" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.012587 5094 scope.go:117] "RemoveContainer" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"
Feb 20 07:38:10 crc kubenswrapper[5094]: E0220 07:38:10.013201 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12\": container with ID starting with 6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12 not found: ID does not exist" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.013260 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"} err="failed to get container status \"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12\": rpc error: code = NotFound desc = could not find container \"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12\": container with ID starting with 6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12 not found: ID does not exist"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.013298 5094 scope.go:117] "RemoveContainer" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"
Feb 20 07:38:10 crc kubenswrapper[5094]: E0220 07:38:10.014027 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8\": container with ID starting with 9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8 not found: ID does not exist" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.014114 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"} err="failed to get container status \"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8\": rpc error: code = NotFound desc = could not find container \"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8\": container with ID starting with 9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8 not found: ID does not exist"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.014169 5094 scope.go:117] "RemoveContainer" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"
Feb 20 07:38:10 crc kubenswrapper[5094]: E0220 07:38:10.015176 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529\": container with ID starting with 7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529 not found: ID does not exist" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.015250 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"} err="failed to get container status \"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529\": rpc error: code = NotFound desc = could not find container \"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529\": container with ID starting with 7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529 not found: ID does not exist"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.274453 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.275010 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z4qm5" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" containerID="cri-o://48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22" gracePeriod=2
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.797575 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.842163 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:38:11 crc kubenswrapper[5094]: E0220 07:38:11.842624 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.853623 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" path="/var/lib/kubelet/pods/aa5d2e9e-7d8a-48bf-a074-eb34159551ed/volumes"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.874541 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"5e5334f3-7266-4687-b9ba-9574b54c9a29\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") "
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.874652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"5e5334f3-7266-4687-b9ba-9574b54c9a29\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") "
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.874761 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"5e5334f3-7266-4687-b9ba-9574b54c9a29\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") "
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.875582 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities" (OuterVolumeSpecName: "utilities") pod "5e5334f3-7266-4687-b9ba-9574b54c9a29" (UID: "5e5334f3-7266-4687-b9ba-9574b54c9a29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.882943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972" (OuterVolumeSpecName: "kube-api-access-2q972") pod "5e5334f3-7266-4687-b9ba-9574b54c9a29" (UID: "5e5334f3-7266-4687-b9ba-9574b54c9a29"). InnerVolumeSpecName "kube-api-access-2q972". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918284 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22" exitCode=0
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918336 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"}
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918370 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"89c9e252d537af174af4376fa5796e176641ea798d855baa2c860628688dd9e3"}
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918393 5094 scope.go:117] "RemoveContainer" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918512 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.957972 5094 scope.go:117] "RemoveContainer" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.979393 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.979515 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.993002 5094 scope.go:117] "RemoveContainer" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.025372 5094 scope.go:117] "RemoveContainer" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"
Feb 20 07:38:12 crc kubenswrapper[5094]: E0220 07:38:12.025931 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22\": container with ID starting with 48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22 not found: ID does not exist" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026034 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"} err="failed to get container status \"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22\": rpc error: code = NotFound desc = could not find container \"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22\": container with ID starting with 48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22 not found: ID does not exist"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026117 5094 scope.go:117] "RemoveContainer" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"
Feb 20 07:38:12 crc kubenswrapper[5094]: E0220 07:38:12.026570 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0\": container with ID starting with cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0 not found: ID does not exist" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026649 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"} err="failed to get container status \"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0\": rpc error: code = NotFound desc = could not find container \"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0\": container with ID starting with cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0 not found: ID does not exist"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026733 5094 scope.go:117] "RemoveContainer" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"
Feb 20 07:38:12 crc kubenswrapper[5094]: E0220 07:38:12.027176 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367\": container with ID starting with 96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367 not found: ID does not exist" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.027251 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"} err="failed to get container status \"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367\": rpc error: code = NotFound desc = could not find container \"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367\": container with ID starting with 96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367 not found: ID does not exist"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.029483 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e5334f3-7266-4687-b9ba-9574b54c9a29" (UID: "5e5334f3-7266-4687-b9ba-9574b54c9a29"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.081956 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.278251 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"] Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.285469 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"] Feb 20 07:38:13 crc kubenswrapper[5094]: I0220 07:38:13.857616 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" path="/var/lib/kubelet/pods/5e5334f3-7266-4687-b9ba-9574b54c9a29/volumes" Feb 20 07:38:24 crc kubenswrapper[5094]: I0220 07:38:24.841042 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:38:24 crc kubenswrapper[5094]: E0220 07:38:24.841632 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:38:36 crc kubenswrapper[5094]: I0220 07:38:36.840483 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:38:36 crc kubenswrapper[5094]: E0220 07:38:36.843654 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:38:49 crc kubenswrapper[5094]: I0220 07:38:49.841334 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:38:49 crc kubenswrapper[5094]: E0220 07:38:49.843137 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:39:01 crc kubenswrapper[5094]: I0220 07:39:01.840801 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:39:01 crc kubenswrapper[5094]: E0220 07:39:01.841880 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:39:12 crc kubenswrapper[5094]: I0220 07:39:12.840418 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:39:13 crc kubenswrapper[5094]: I0220 07:39:13.522302 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070"} Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.513214 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514536 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514564 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514589 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514601 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514625 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514636 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514662 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514681 5094 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514691 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514748 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514762 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.515015 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.515053 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.516884 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.539500 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.671668 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.671859 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.672117 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774295 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774402 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774953 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.796378 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.855248 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.377389 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.620357 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc53884f-e926-4579-956a-0c9719af5d1e" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" exitCode=0 Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.620416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf"} Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.620451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerStarted","Data":"29f8cb5dcecf22a8c0b8b2443cdd71954f5dcbf716659229fa65a0f279b47e16"} Feb 20 07:40:54 crc kubenswrapper[5094]: I0220 07:40:54.634595 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc53884f-e926-4579-956a-0c9719af5d1e" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" exitCode=0 Feb 20 07:40:54 crc kubenswrapper[5094]: I0220 07:40:54.634735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89"} Feb 20 07:40:55 crc kubenswrapper[5094]: I0220 07:40:55.646030 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" 
event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerStarted","Data":"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91"} Feb 20 07:40:55 crc kubenswrapper[5094]: I0220 07:40:55.684957 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5zpzv" podStartSLOduration=2.250436356 podStartE2EDuration="3.684933057s" podCreationTimestamp="2026-02-20 07:40:52 +0000 UTC" firstStartedPulling="2026-02-20 07:40:53.623370363 +0000 UTC m=+3268.495997074" lastFinishedPulling="2026-02-20 07:40:55.057867054 +0000 UTC m=+3269.930493775" observedRunningTime="2026-02-20 07:40:55.677654513 +0000 UTC m=+3270.550281294" watchObservedRunningTime="2026-02-20 07:40:55.684933057 +0000 UTC m=+3270.557559768" Feb 20 07:41:02 crc kubenswrapper[5094]: I0220 07:41:02.856037 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:02 crc kubenswrapper[5094]: I0220 07:41:02.857831 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:02 crc kubenswrapper[5094]: I0220 07:41:02.921873 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:03 crc kubenswrapper[5094]: I0220 07:41:03.770476 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:03 crc kubenswrapper[5094]: I0220 07:41:03.838156 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:41:05 crc kubenswrapper[5094]: I0220 07:41:05.731547 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5zpzv" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" 
containerID="cri-o://a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" gracePeriod=2 Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.216225 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.324440 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"fc53884f-e926-4579-956a-0c9719af5d1e\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.324497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"fc53884f-e926-4579-956a-0c9719af5d1e\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.324575 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"fc53884f-e926-4579-956a-0c9719af5d1e\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.325868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities" (OuterVolumeSpecName: "utilities") pod "fc53884f-e926-4579-956a-0c9719af5d1e" (UID: "fc53884f-e926-4579-956a-0c9719af5d1e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.331229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d" (OuterVolumeSpecName: "kube-api-access-czt8d") pod "fc53884f-e926-4579-956a-0c9719af5d1e" (UID: "fc53884f-e926-4579-956a-0c9719af5d1e"). InnerVolumeSpecName "kube-api-access-czt8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.351735 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc53884f-e926-4579-956a-0c9719af5d1e" (UID: "fc53884f-e926-4579-956a-0c9719af5d1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.426121 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.426159 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.426172 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") on node \"crc\" DevicePath \"\"" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763385 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc53884f-e926-4579-956a-0c9719af5d1e" 
containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" exitCode=0 Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763460 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91"} Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763519 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"29f8cb5dcecf22a8c0b8b2443cdd71954f5dcbf716659229fa65a0f279b47e16"} Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763552 5094 scope.go:117] "RemoveContainer" containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763554 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.801217 5094 scope.go:117] "RemoveContainer" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.828063 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.841064 5094 scope.go:117] "RemoveContainer" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.846065 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.865294 5094 scope.go:117] "RemoveContainer" containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" Feb 20 07:41:06 crc kubenswrapper[5094]: E0220 07:41:06.865844 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91\": container with ID starting with a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91 not found: ID does not exist" containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.865884 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91"} err="failed to get container status \"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91\": rpc error: code = NotFound desc = could not find container \"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91\": container with ID starting with a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91 not found: 
ID does not exist" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.865914 5094 scope.go:117] "RemoveContainer" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" Feb 20 07:41:06 crc kubenswrapper[5094]: E0220 07:41:06.866434 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89\": container with ID starting with 16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89 not found: ID does not exist" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.866463 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89"} err="failed to get container status \"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89\": rpc error: code = NotFound desc = could not find container \"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89\": container with ID starting with 16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89 not found: ID does not exist" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.866482 5094 scope.go:117] "RemoveContainer" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" Feb 20 07:41:06 crc kubenswrapper[5094]: E0220 07:41:06.866875 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf\": container with ID starting with c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf not found: ID does not exist" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.866902 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf"} err="failed to get container status \"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf\": rpc error: code = NotFound desc = could not find container \"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf\": container with ID starting with c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf not found: ID does not exist" Feb 20 07:41:07 crc kubenswrapper[5094]: I0220 07:41:07.856476 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" path="/var/lib/kubelet/pods/fc53884f-e926-4579-956a-0c9719af5d1e/volumes" Feb 20 07:41:34 crc kubenswrapper[5094]: I0220 07:41:34.107317 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:41:34 crc kubenswrapper[5094]: I0220 07:41:34.108088 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:42:04 crc kubenswrapper[5094]: I0220 07:42:04.106954 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:42:04 crc kubenswrapper[5094]: I0220 07:42:04.108031 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.106643 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.107501 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.107581 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.108619 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.108765 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070" gracePeriod=600 Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666019 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070" exitCode=0 Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666055 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070"} Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949"} Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666607 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:44:34 crc kubenswrapper[5094]: I0220 07:44:34.107458 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:44:34 crc kubenswrapper[5094]: I0220 07:44:34.109988 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:45:00 crc kubenswrapper[5094]: 
I0220 07:45:00.173343 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 07:45:00 crc kubenswrapper[5094]: E0220 07:45:00.174940 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-utilities" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.174967 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-utilities" Feb 20 07:45:00 crc kubenswrapper[5094]: E0220 07:45:00.174988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.175000 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" Feb 20 07:45:00 crc kubenswrapper[5094]: E0220 07:45:00.175033 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-content" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.175044 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-content" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.175264 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.176025 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.179789 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.180140 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.180978 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.258614 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.259006 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.259051 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.360889 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.361311 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.361360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.364518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.374002 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.395660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.515675 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.807773 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 07:45:01 crc kubenswrapper[5094]: I0220 07:45:01.139921 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerStarted","Data":"362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb"} Feb 20 07:45:01 crc kubenswrapper[5094]: I0220 07:45:01.140433 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerStarted","Data":"0590d2f3731c18e064c43a35704333d310e3d17c842b6353c78f230a2bbbb357"} Feb 20 07:45:01 crc kubenswrapper[5094]: I0220 07:45:01.171101 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" 
podStartSLOduration=1.169931681 podStartE2EDuration="1.169931681s" podCreationTimestamp="2026-02-20 07:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:45:01.163945316 +0000 UTC m=+3516.036572067" watchObservedRunningTime="2026-02-20 07:45:01.169931681 +0000 UTC m=+3516.042558402" Feb 20 07:45:02 crc kubenswrapper[5094]: I0220 07:45:02.155607 5094 generic.go:334] "Generic (PLEG): container finished" podID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerID="362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb" exitCode=0 Feb 20 07:45:02 crc kubenswrapper[5094]: I0220 07:45:02.155697 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerDied","Data":"362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb"} Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.567238 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.724240 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"1c1d2dad-446d-40c2-aceb-de13411f5c93\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.724406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"1c1d2dad-446d-40c2-aceb-de13411f5c93\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.724452 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"1c1d2dad-446d-40c2-aceb-de13411f5c93\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.725515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c1d2dad-446d-40c2-aceb-de13411f5c93" (UID: "1c1d2dad-446d-40c2-aceb-de13411f5c93"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.725842 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.731537 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp" (OuterVolumeSpecName: "kube-api-access-9f8zp") pod "1c1d2dad-446d-40c2-aceb-de13411f5c93" (UID: "1c1d2dad-446d-40c2-aceb-de13411f5c93"). InnerVolumeSpecName "kube-api-access-9f8zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.735033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c1d2dad-446d-40c2-aceb-de13411f5c93" (UID: "1c1d2dad-446d-40c2-aceb-de13411f5c93"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.827288 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") on node \"crc\" DevicePath \"\"" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.827333 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.106892 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.107028 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.178275 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerDied","Data":"0590d2f3731c18e064c43a35704333d310e3d17c842b6353c78f230a2bbbb357"} Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.178347 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0590d2f3731c18e064c43a35704333d310e3d17c842b6353c78f230a2bbbb357" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.178401 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.257423 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"] Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.269625 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"] Feb 20 07:45:05 crc kubenswrapper[5094]: I0220 07:45:05.851334 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" path="/var/lib/kubelet/pods/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d/volumes" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.106648 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.107756 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.107888 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.109160 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949"} 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.109273 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" gracePeriod=600 Feb 20 07:45:34 crc kubenswrapper[5094]: E0220 07:45:34.247347 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.521260 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" exitCode=0 Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.521699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949"} Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.522057 5094 scope.go:117] "RemoveContainer" containerID="75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.523053 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 
20 07:45:34 crc kubenswrapper[5094]: E0220 07:45:34.523494 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:45:35 crc kubenswrapper[5094]: I0220 07:45:35.268464 5094 scope.go:117] "RemoveContainer" containerID="a3c7984448d7f3db690223dff864550548436ad39114dac24772a87d3288c8ea" Feb 20 07:45:45 crc kubenswrapper[5094]: I0220 07:45:45.849589 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:45:45 crc kubenswrapper[5094]: E0220 07:45:45.850921 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:45:56 crc kubenswrapper[5094]: I0220 07:45:56.842413 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:45:56 crc kubenswrapper[5094]: E0220 07:45:56.843980 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:09 crc kubenswrapper[5094]: I0220 07:46:09.840985 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:09 crc kubenswrapper[5094]: E0220 07:46:09.842164 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:22 crc kubenswrapper[5094]: I0220 07:46:22.841122 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:22 crc kubenswrapper[5094]: E0220 07:46:22.842676 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:35 crc kubenswrapper[5094]: I0220 07:46:35.841137 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:35 crc kubenswrapper[5094]: E0220 07:46:35.843199 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:46 crc kubenswrapper[5094]: I0220 07:46:46.840444 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:46 crc kubenswrapper[5094]: E0220 07:46:46.841591 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:59 crc kubenswrapper[5094]: I0220 07:46:59.840129 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:59 crc kubenswrapper[5094]: E0220 07:46:59.841121 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:14 crc kubenswrapper[5094]: I0220 07:47:14.841976 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:14 crc kubenswrapper[5094]: E0220 07:47:14.842992 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:28 crc kubenswrapper[5094]: I0220 07:47:28.840230 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:28 crc kubenswrapper[5094]: E0220 07:47:28.841310 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:39 crc kubenswrapper[5094]: I0220 07:47:39.840589 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:39 crc kubenswrapper[5094]: E0220 07:47:39.841987 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:50 crc kubenswrapper[5094]: I0220 07:47:50.842222 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:50 crc kubenswrapper[5094]: E0220 07:47:50.843749 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.444893 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:02 crc kubenswrapper[5094]: E0220 07:48:02.446178 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerName="collect-profiles" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.446198 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerName="collect-profiles" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.446418 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerName="collect-profiles" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.447809 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.461049 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.559408 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.559734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.559798 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.661337 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.661483 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.661518 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.662318 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.662598 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.698727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.778644 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.840413 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:02 crc kubenswrapper[5094]: E0220 07:48:02.841046 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:03 crc kubenswrapper[5094]: I0220 07:48:03.358513 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.077497 5094 generic.go:334] "Generic (PLEG): container finished" podID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" exitCode=0 Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.077584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745"} Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.077988 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerStarted","Data":"c0599642c8d98050e1f7e47d03328b62c53a4b4fd04a8ac2ddf65e7ae4f3f5e8"} Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.081064 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:48:05 crc 
kubenswrapper[5094]: I0220 07:48:05.091120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerStarted","Data":"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717"} Feb 20 07:48:06 crc kubenswrapper[5094]: I0220 07:48:06.101647 5094 generic.go:334] "Generic (PLEG): container finished" podID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" exitCode=0 Feb 20 07:48:06 crc kubenswrapper[5094]: I0220 07:48:06.101749 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717"} Feb 20 07:48:07 crc kubenswrapper[5094]: I0220 07:48:07.114521 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerStarted","Data":"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e"} Feb 20 07:48:07 crc kubenswrapper[5094]: I0220 07:48:07.154463 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-npwbj" podStartSLOduration=2.736642839 podStartE2EDuration="5.154434659s" podCreationTimestamp="2026-02-20 07:48:02 +0000 UTC" firstStartedPulling="2026-02-20 07:48:04.080784637 +0000 UTC m=+3698.953411358" lastFinishedPulling="2026-02-20 07:48:06.498576457 +0000 UTC m=+3701.371203178" observedRunningTime="2026-02-20 07:48:07.14485962 +0000 UTC m=+3702.017486371" watchObservedRunningTime="2026-02-20 07:48:07.154434659 +0000 UTC m=+3702.027061410" Feb 20 07:48:12 crc kubenswrapper[5094]: I0220 07:48:12.779180 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:12 crc kubenswrapper[5094]: I0220 07:48:12.780955 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:13 crc kubenswrapper[5094]: I0220 07:48:13.852596 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-npwbj" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" probeResult="failure" output=< Feb 20 07:48:13 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 07:48:13 crc kubenswrapper[5094]: > Feb 20 07:48:14 crc kubenswrapper[5094]: I0220 07:48:14.841358 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:14 crc kubenswrapper[5094]: E0220 07:48:14.842186 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:22 crc kubenswrapper[5094]: I0220 07:48:22.854633 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:22 crc kubenswrapper[5094]: I0220 07:48:22.933276 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:23 crc kubenswrapper[5094]: I0220 07:48:23.105830 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.325364 5094 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-npwbj" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" containerID="cri-o://7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" gracePeriod=2 Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.850780 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.975971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"acf90a1a-02eb-43e2-9533-ca348b502c35\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.976131 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"acf90a1a-02eb-43e2-9533-ca348b502c35\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.976236 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"acf90a1a-02eb-43e2-9533-ca348b502c35\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.978111 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities" (OuterVolumeSpecName: "utilities") pod "acf90a1a-02eb-43e2-9533-ca348b502c35" (UID: "acf90a1a-02eb-43e2-9533-ca348b502c35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.987095 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s" (OuterVolumeSpecName: "kube-api-access-8vj9s") pod "acf90a1a-02eb-43e2-9533-ca348b502c35" (UID: "acf90a1a-02eb-43e2-9533-ca348b502c35"). InnerVolumeSpecName "kube-api-access-8vj9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.078790 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.078847 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") on node \"crc\" DevicePath \"\"" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.154597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acf90a1a-02eb-43e2-9533-ca348b502c35" (UID: "acf90a1a-02eb-43e2-9533-ca348b502c35"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.180565 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338559 5094 generic.go:334] "Generic (PLEG): container finished" podID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" exitCode=0 Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338633 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e"} Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338657 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338776 5094 scope.go:117] "RemoveContainer" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"c0599642c8d98050e1f7e47d03328b62c53a4b4fd04a8ac2ddf65e7ae4f3f5e8"} Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.371415 5094 scope.go:117] "RemoveContainer" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.390980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.408028 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.411066 5094 scope.go:117] "RemoveContainer" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.443279 5094 scope.go:117] "RemoveContainer" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" Feb 20 07:48:25 crc kubenswrapper[5094]: E0220 07:48:25.444858 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e\": container with ID starting with 7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e not found: ID does not exist" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.444911 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e"} err="failed to get container status \"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e\": rpc error: code = NotFound desc = could not find container \"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e\": container with ID starting with 7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e not found: ID does not exist" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.445030 5094 scope.go:117] "RemoveContainer" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" Feb 20 07:48:25 crc kubenswrapper[5094]: E0220 07:48:25.445490 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717\": container with ID starting with 9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717 not found: ID does not exist" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.445542 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717"} err="failed to get container status \"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717\": rpc error: code = NotFound desc = could not find container \"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717\": container with ID starting with 9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717 not found: ID does not exist" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.445576 5094 scope.go:117] "RemoveContainer" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" Feb 20 07:48:25 crc kubenswrapper[5094]: E0220 
07:48:25.446230 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745\": container with ID starting with 2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745 not found: ID does not exist" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.446267 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745"} err="failed to get container status \"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745\": rpc error: code = NotFound desc = could not find container \"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745\": container with ID starting with 2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745 not found: ID does not exist" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.858306 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" path="/var/lib/kubelet/pods/acf90a1a-02eb-43e2-9533-ca348b502c35/volumes" Feb 20 07:48:28 crc kubenswrapper[5094]: I0220 07:48:28.840974 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:28 crc kubenswrapper[5094]: E0220 07:48:28.844218 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:43 crc kubenswrapper[5094]: I0220 07:48:43.841328 
5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:43 crc kubenswrapper[5094]: E0220 07:48:43.842616 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:56 crc kubenswrapper[5094]: I0220 07:48:56.840620 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:56 crc kubenswrapper[5094]: E0220 07:48:56.841974 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:09 crc kubenswrapper[5094]: I0220 07:49:09.840507 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:09 crc kubenswrapper[5094]: E0220 07:49:09.841810 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:21 crc kubenswrapper[5094]: I0220 
07:49:21.841410 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:21 crc kubenswrapper[5094]: E0220 07:49:21.843950 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:35 crc kubenswrapper[5094]: I0220 07:49:35.882966 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:35 crc kubenswrapper[5094]: E0220 07:49:35.885940 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:48 crc kubenswrapper[5094]: I0220 07:49:48.840812 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:48 crc kubenswrapper[5094]: E0220 07:49:48.842111 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:50:03 crc 
kubenswrapper[5094]: I0220 07:50:03.841138 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:03 crc kubenswrapper[5094]: E0220 07:50:03.842846 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:50:18 crc kubenswrapper[5094]: I0220 07:50:18.840474 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:18 crc kubenswrapper[5094]: E0220 07:50:18.841432 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:50:32 crc kubenswrapper[5094]: I0220 07:50:32.841180 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:32 crc kubenswrapper[5094]: E0220 07:50:32.843206 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 
20 07:50:46 crc kubenswrapper[5094]: I0220 07:50:46.840194 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:47 crc kubenswrapper[5094]: I0220 07:50:47.864161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe"} Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.924941 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:00 crc kubenswrapper[5094]: E0220 07:51:00.926762 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.926792 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" Feb 20 07:51:00 crc kubenswrapper[5094]: E0220 07:51:00.926812 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-utilities" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.926829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-utilities" Feb 20 07:51:00 crc kubenswrapper[5094]: E0220 07:51:00.926869 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-content" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.926887 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-content" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.927176 5094 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.966553 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.966761 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.019317 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.019431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.019544 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.121328 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"redhat-marketplace-nqmbf\" (UID: 
\"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.121425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.121486 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.122115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.122351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.452073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " 
pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.596320 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:02 crc kubenswrapper[5094]: I0220 07:51:02.095740 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:03 crc kubenswrapper[5094]: I0220 07:51:03.030183 5094 generic.go:334] "Generic (PLEG): container finished" podID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" exitCode=0 Feb 20 07:51:03 crc kubenswrapper[5094]: I0220 07:51:03.030466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf"} Feb 20 07:51:03 crc kubenswrapper[5094]: I0220 07:51:03.030814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerStarted","Data":"0984cce57642e3526adbaa0385b41cc36a181fef4aa681377f1c1423aa7645e7"} Feb 20 07:51:05 crc kubenswrapper[5094]: I0220 07:51:05.056596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerStarted","Data":"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e"} Feb 20 07:51:06 crc kubenswrapper[5094]: I0220 07:51:06.070020 5094 generic.go:334] "Generic (PLEG): container finished" podID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" exitCode=0 Feb 20 07:51:06 crc kubenswrapper[5094]: I0220 07:51:06.070137 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e"} Feb 20 07:51:07 crc kubenswrapper[5094]: I0220 07:51:07.099443 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerStarted","Data":"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c"} Feb 20 07:51:07 crc kubenswrapper[5094]: I0220 07:51:07.141356 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nqmbf" podStartSLOduration=3.707493975 podStartE2EDuration="7.1413118s" podCreationTimestamp="2026-02-20 07:51:00 +0000 UTC" firstStartedPulling="2026-02-20 07:51:03.03408804 +0000 UTC m=+3877.906714791" lastFinishedPulling="2026-02-20 07:51:06.467905855 +0000 UTC m=+3881.340532616" observedRunningTime="2026-02-20 07:51:07.132513098 +0000 UTC m=+3882.005139899" watchObservedRunningTime="2026-02-20 07:51:07.1413118 +0000 UTC m=+3882.013938551" Feb 20 07:51:11 crc kubenswrapper[5094]: I0220 07:51:11.597392 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:11 crc kubenswrapper[5094]: I0220 07:51:11.598372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:11 crc kubenswrapper[5094]: I0220 07:51:11.667186 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:12 crc kubenswrapper[5094]: I0220 07:51:12.203222 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:12 crc kubenswrapper[5094]: I0220 07:51:12.292296 5094 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.169139 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nqmbf" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" containerID="cri-o://96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" gracePeriod=2 Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.743504 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.907796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.907884 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.908023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.909990 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities" (OuterVolumeSpecName: "utilities") pod 
"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" (UID: "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.947939 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" (UID: "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.010787 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.010869 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.144785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz" (OuterVolumeSpecName: "kube-api-access-72wpz") pod "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" (UID: "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea"). InnerVolumeSpecName "kube-api-access-72wpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195274 5094 generic.go:334] "Generic (PLEG): container finished" podID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" exitCode=0 Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c"} Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"0984cce57642e3526adbaa0385b41cc36a181fef4aa681377f1c1423aa7645e7"} Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195409 5094 scope.go:117] "RemoveContainer" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195792 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.215380 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") on node \"crc\" DevicePath \"\"" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.235917 5094 scope.go:117] "RemoveContainer" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.253798 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.268572 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.359095 5094 scope.go:117] "RemoveContainer" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.389044 5094 scope.go:117] "RemoveContainer" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" Feb 20 07:51:15 crc kubenswrapper[5094]: E0220 07:51:15.389668 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c\": container with ID starting with 96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c not found: ID does not exist" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.389786 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c"} err="failed to get container status 
\"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c\": rpc error: code = NotFound desc = could not find container \"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c\": container with ID starting with 96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c not found: ID does not exist" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.389849 5094 scope.go:117] "RemoveContainer" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" Feb 20 07:51:15 crc kubenswrapper[5094]: E0220 07:51:15.390419 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e\": container with ID starting with 811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e not found: ID does not exist" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.390488 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e"} err="failed to get container status \"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e\": rpc error: code = NotFound desc = could not find container \"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e\": container with ID starting with 811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e not found: ID does not exist" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.390532 5094 scope.go:117] "RemoveContainer" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" Feb 20 07:51:15 crc kubenswrapper[5094]: E0220 07:51:15.391222 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf\": container with ID starting with 4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf not found: ID does not exist" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.391274 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf"} err="failed to get container status \"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf\": rpc error: code = NotFound desc = could not find container \"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf\": container with ID starting with 4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf not found: ID does not exist" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.857739 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" path="/var/lib/kubelet/pods/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea/volumes" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.751929 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:51:47 crc kubenswrapper[5094]: E0220 07:51:47.753379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-content" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753397 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-content" Feb 20 07:51:47 crc kubenswrapper[5094]: E0220 07:51:47.753423 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-utilities" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753432 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-utilities" Feb 20 07:51:47 crc kubenswrapper[5094]: E0220 07:51:47.753464 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753473 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753668 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.755067 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.777405 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.865827 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.865914 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.865941 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.967194 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.967293 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.967312 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.968386 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.968942 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.992105 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:48 crc kubenswrapper[5094]: I0220 07:51:48.085682 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:48 crc kubenswrapper[5094]: I0220 07:51:48.636205 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:51:49 crc kubenswrapper[5094]: I0220 07:51:49.529330 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a369096-f833-46f9-93c3-f05f985168c2" containerID="12fc2e66175a94ce2eb8ceb7a1f31e3188bf472dcb51c49a41f1fd905ecc9e2b" exitCode=0 Feb 20 07:51:49 crc kubenswrapper[5094]: I0220 07:51:49.529412 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"12fc2e66175a94ce2eb8ceb7a1f31e3188bf472dcb51c49a41f1fd905ecc9e2b"} Feb 20 07:51:49 crc kubenswrapper[5094]: I0220 07:51:49.529851 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerStarted","Data":"72202432c6c5201712870e02191031f3b7dfe04b8cff9bef61b9a586db1fda5c"} Feb 20 07:51:50 crc kubenswrapper[5094]: I0220 
07:51:50.541694 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerStarted","Data":"d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c"} Feb 20 07:51:51 crc kubenswrapper[5094]: I0220 07:51:51.554614 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a369096-f833-46f9-93c3-f05f985168c2" containerID="d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c" exitCode=0 Feb 20 07:51:51 crc kubenswrapper[5094]: I0220 07:51:51.554735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c"} Feb 20 07:51:52 crc kubenswrapper[5094]: I0220 07:51:52.570171 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerStarted","Data":"d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5"} Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.086744 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.089911 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.173566 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.206093 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4wbdf" podStartSLOduration=8.73740356 
podStartE2EDuration="11.206045599s" podCreationTimestamp="2026-02-20 07:51:47 +0000 UTC" firstStartedPulling="2026-02-20 07:51:49.53152548 +0000 UTC m=+3924.404152191" lastFinishedPulling="2026-02-20 07:51:52.000167479 +0000 UTC m=+3926.872794230" observedRunningTime="2026-02-20 07:51:52.60090249 +0000 UTC m=+3927.473529251" watchObservedRunningTime="2026-02-20 07:51:58.206045599 +0000 UTC m=+3933.078672340" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.663770 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.711411 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:52:00 crc kubenswrapper[5094]: I0220 07:52:00.633887 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4wbdf" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" containerID="cri-o://d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5" gracePeriod=2 Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.653072 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a369096-f833-46f9-93c3-f05f985168c2" containerID="d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5" exitCode=0 Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.653170 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5"} Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.711957 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.828679 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"4a369096-f833-46f9-93c3-f05f985168c2\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.828809 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"4a369096-f833-46f9-93c3-f05f985168c2\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.828868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"4a369096-f833-46f9-93c3-f05f985168c2\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.830515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities" (OuterVolumeSpecName: "utilities") pod "4a369096-f833-46f9-93c3-f05f985168c2" (UID: "4a369096-f833-46f9-93c3-f05f985168c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.840085 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z" (OuterVolumeSpecName: "kube-api-access-xch2z") pod "4a369096-f833-46f9-93c3-f05f985168c2" (UID: "4a369096-f833-46f9-93c3-f05f985168c2"). InnerVolumeSpecName "kube-api-access-xch2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.907999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a369096-f833-46f9-93c3-f05f985168c2" (UID: "4a369096-f833-46f9-93c3-f05f985168c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.934865 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.935504 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.935610 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") on node \"crc\" DevicePath \"\"" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.671999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"72202432c6c5201712870e02191031f3b7dfe04b8cff9bef61b9a586db1fda5c"} Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.672112 5094 scope.go:117] "RemoveContainer" containerID="d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.672123 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.706905 5094 scope.go:117] "RemoveContainer" containerID="d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.745384 5094 scope.go:117] "RemoveContainer" containerID="12fc2e66175a94ce2eb8ceb7a1f31e3188bf472dcb51c49a41f1fd905ecc9e2b" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.748592 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.760275 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:52:03 crc kubenswrapper[5094]: I0220 07:52:03.861228 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a369096-f833-46f9-93c3-f05f985168c2" path="/var/lib/kubelet/pods/4a369096-f833-46f9-93c3-f05f985168c2/volumes" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.356288 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:03 crc kubenswrapper[5094]: E0220 07:53:03.357621 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-content" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.357644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-content" Feb 20 07:53:03 crc kubenswrapper[5094]: E0220 07:53:03.357683 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.357695 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" 
Feb 20 07:53:03 crc kubenswrapper[5094]: E0220 07:53:03.357744 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-utilities" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.357759 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-utilities" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.358039 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.362959 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.383320 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.481896 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-catalog-content\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.482019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-utilities\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.482075 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxr9n\" (UniqueName: 
\"kubernetes.io/projected/966d704a-5474-4b23-b125-63789f45ee54-kube-api-access-fxr9n\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.583459 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-utilities\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.583554 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxr9n\" (UniqueName: \"kubernetes.io/projected/966d704a-5474-4b23-b125-63789f45ee54-kube-api-access-fxr9n\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.583632 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-catalog-content\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.584452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-catalog-content\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.584781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-utilities\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.614852 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxr9n\" (UniqueName: \"kubernetes.io/projected/966d704a-5474-4b23-b125-63789f45ee54-kube-api-access-fxr9n\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.722961 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.107345 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.107977 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.243470 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.295524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" 
event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerStarted","Data":"c6a2d57b0006fd94772a9d020943c798981c6241465d4cdcc4eea15f613aac56"} Feb 20 07:53:05 crc kubenswrapper[5094]: I0220 07:53:05.310298 5094 generic.go:334] "Generic (PLEG): container finished" podID="966d704a-5474-4b23-b125-63789f45ee54" containerID="d65564fe78761429178c8f6287f3ce04ee576c655291e31bd89c9884855ac402" exitCode=0 Feb 20 07:53:05 crc kubenswrapper[5094]: I0220 07:53:05.310377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerDied","Data":"d65564fe78761429178c8f6287f3ce04ee576c655291e31bd89c9884855ac402"} Feb 20 07:53:05 crc kubenswrapper[5094]: I0220 07:53:05.314612 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:53:10 crc kubenswrapper[5094]: I0220 07:53:10.361458 5094 generic.go:334] "Generic (PLEG): container finished" podID="966d704a-5474-4b23-b125-63789f45ee54" containerID="5a7511e6676c5ab50f40da25e1815cf9c50290e0aad0403c8709bb94e3b5cb03" exitCode=0 Feb 20 07:53:10 crc kubenswrapper[5094]: I0220 07:53:10.361598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerDied","Data":"5a7511e6676c5ab50f40da25e1815cf9c50290e0aad0403c8709bb94e3b5cb03"} Feb 20 07:53:11 crc kubenswrapper[5094]: I0220 07:53:11.376182 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerStarted","Data":"d0e403d379777aa95fad81293d03def2b5729155fcd79be2b6e78032a7c9e16f"} Feb 20 07:53:11 crc kubenswrapper[5094]: I0220 07:53:11.407399 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bhkm8" 
podStartSLOduration=2.974705405 podStartE2EDuration="8.407368102s" podCreationTimestamp="2026-02-20 07:53:03 +0000 UTC" firstStartedPulling="2026-02-20 07:53:05.31410997 +0000 UTC m=+4000.186736721" lastFinishedPulling="2026-02-20 07:53:10.746772707 +0000 UTC m=+4005.619399418" observedRunningTime="2026-02-20 07:53:11.399414021 +0000 UTC m=+4006.272040742" watchObservedRunningTime="2026-02-20 07:53:11.407368102 +0000 UTC m=+4006.279994853" Feb 20 07:53:13 crc kubenswrapper[5094]: I0220 07:53:13.723604 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:13 crc kubenswrapper[5094]: I0220 07:53:13.725836 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:13 crc kubenswrapper[5094]: I0220 07:53:13.787640 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.779092 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.903040 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.981252 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.981975 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nhpxw" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" containerID="cri-o://533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" gracePeriod=2 Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.470460 5094 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497339 5094 generic.go:334] "Generic (PLEG): container finished" podID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" exitCode=0 Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd"} Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497415 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497728 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f"} Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497764 5094 scope.go:117] "RemoveContainer" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.523523 5094 scope.go:117] "RemoveContainer" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.558237 5094 scope.go:117] "RemoveContainer" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.585575 5094 scope.go:117] "RemoveContainer" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" Feb 20 07:53:24 crc kubenswrapper[5094]: E0220 
07:53:24.586387 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd\": container with ID starting with 533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd not found: ID does not exist" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.586422 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd"} err="failed to get container status \"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd\": rpc error: code = NotFound desc = could not find container \"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd\": container with ID starting with 533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd not found: ID does not exist" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.586447 5094 scope.go:117] "RemoveContainer" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" Feb 20 07:53:24 crc kubenswrapper[5094]: E0220 07:53:24.587205 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4\": container with ID starting with 6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4 not found: ID does not exist" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.587243 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4"} err="failed to get container status \"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4\": rpc 
error: code = NotFound desc = could not find container \"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4\": container with ID starting with 6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4 not found: ID does not exist" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.587258 5094 scope.go:117] "RemoveContainer" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" Feb 20 07:53:24 crc kubenswrapper[5094]: E0220 07:53:24.587772 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a\": container with ID starting with 258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a not found: ID does not exist" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.587798 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a"} err="failed to get container status \"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a\": rpc error: code = NotFound desc = could not find container \"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a\": container with ID starting with 258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a not found: ID does not exist" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.599577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"061991e0-0b0a-4e47-9275-e00b323e9fb2\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.599662 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"061991e0-0b0a-4e47-9275-e00b323e9fb2\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.599693 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"061991e0-0b0a-4e47-9275-e00b323e9fb2\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.600162 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities" (OuterVolumeSpecName: "utilities") pod "061991e0-0b0a-4e47-9275-e00b323e9fb2" (UID: "061991e0-0b0a-4e47-9275-e00b323e9fb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.607213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j" (OuterVolumeSpecName: "kube-api-access-bmx6j") pod "061991e0-0b0a-4e47-9275-e00b323e9fb2" (UID: "061991e0-0b0a-4e47-9275-e00b323e9fb2"). InnerVolumeSpecName "kube-api-access-bmx6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.657279 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "061991e0-0b0a-4e47-9275-e00b323e9fb2" (UID: "061991e0-0b0a-4e47-9275-e00b323e9fb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.701385 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.701427 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") on node \"crc\" DevicePath \"\"" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.701438 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.850799 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.878725 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 07:53:25 crc kubenswrapper[5094]: I0220 07:53:25.849079 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" path="/var/lib/kubelet/pods/061991e0-0b0a-4e47-9275-e00b323e9fb2/volumes" Feb 20 07:53:34 crc kubenswrapper[5094]: I0220 07:53:34.106752 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:53:34 crc kubenswrapper[5094]: I0220 07:53:34.107688 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.107134 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.108179 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.108252 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.109283 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.109350 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe" gracePeriod=600 Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.925471 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe" exitCode=0 Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.925589 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe"} Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.926135 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"} Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.926183 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:56:04 crc kubenswrapper[5094]: I0220 07:56:04.106826 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:56:04 crc kubenswrapper[5094]: I0220 07:56:04.108072 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:56:34 crc kubenswrapper[5094]: 
I0220 07:56:34.107516 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:56:34 crc kubenswrapper[5094]: I0220 07:56:34.108597 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.107355 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.108585 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.108675 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.110116 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"} 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.110292 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" gracePeriod=600 Feb 20 07:57:04 crc kubenswrapper[5094]: E0220 07:57:04.242818 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.872584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"} Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.873222 5094 scope.go:117] "RemoveContainer" containerID="49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.872511 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" exitCode=0 Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.874214 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 
20 07:57:04 crc kubenswrapper[5094]: E0220 07:57:04.874541 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:16 crc kubenswrapper[5094]: I0220 07:57:16.839647 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:16 crc kubenswrapper[5094]: E0220 07:57:16.840992 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:29 crc kubenswrapper[5094]: I0220 07:57:29.840507 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:29 crc kubenswrapper[5094]: E0220 07:57:29.841828 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:44 crc kubenswrapper[5094]: I0220 07:57:44.840329 5094 scope.go:117] "RemoveContainer" 
containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:44 crc kubenswrapper[5094]: E0220 07:57:44.841675 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:57 crc kubenswrapper[5094]: I0220 07:57:57.840113 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:57 crc kubenswrapper[5094]: E0220 07:57:57.842004 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:12 crc kubenswrapper[5094]: I0220 07:58:12.841277 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:12 crc kubenswrapper[5094]: E0220 07:58:12.842312 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:23 crc kubenswrapper[5094]: I0220 07:58:23.841219 5094 scope.go:117] 
"RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:23 crc kubenswrapper[5094]: E0220 07:58:23.842687 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:34 crc kubenswrapper[5094]: I0220 07:58:34.839970 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:34 crc kubenswrapper[5094]: E0220 07:58:34.840792 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.227808 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:58:39 crc kubenswrapper[5094]: E0220 07:58:39.228970 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-utilities" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.228987 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-utilities" Feb 20 07:58:39 crc kubenswrapper[5094]: E0220 07:58:39.228999 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.229005 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" Feb 20 07:58:39 crc kubenswrapper[5094]: E0220 07:58:39.229019 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-content" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.229026 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-content" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.229181 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.230288 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.289511 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.349305 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.349414 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"redhat-operators-2pznd\" (UID: 
\"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.349727 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.451748 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.451904 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.451973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.452484 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " 
pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.452553 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.477316 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.550039 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.998342 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"]
Feb 20 07:58:40 crc kubenswrapper[5094]: I0220 07:58:40.028234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerStarted","Data":"16f2d279e9786bec6efd08bd1ca91aa319b918a099a927f356625ed09bf4f4dd"}
Feb 20 07:58:41 crc kubenswrapper[5094]: I0220 07:58:41.038650 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5" exitCode=0
Feb 20 07:58:41 crc kubenswrapper[5094]: I0220 07:58:41.039020 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5"}
Feb 20 07:58:41 crc kubenswrapper[5094]: I0220 07:58:41.041559 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 07:58:42 crc kubenswrapper[5094]: I0220 07:58:42.051603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerStarted","Data":"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"}
Feb 20 07:58:43 crc kubenswrapper[5094]: I0220 07:58:43.065122 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71" exitCode=0
Feb 20 07:58:43 crc kubenswrapper[5094]: I0220 07:58:43.065179 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"}
Feb 20 07:58:44 crc kubenswrapper[5094]: I0220 07:58:44.078457 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerStarted","Data":"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"}
Feb 20 07:58:44 crc kubenswrapper[5094]: I0220 07:58:44.122104 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2pznd" podStartSLOduration=2.617357367 podStartE2EDuration="5.122073985s" podCreationTimestamp="2026-02-20 07:58:39 +0000 UTC" firstStartedPulling="2026-02-20 07:58:41.041261598 +0000 UTC m=+4335.913888319" lastFinishedPulling="2026-02-20 07:58:43.545978216 +0000 UTC m=+4338.418604937" observedRunningTime="2026-02-20 07:58:44.114749688 +0000 UTC m=+4338.987376429" watchObservedRunningTime="2026-02-20 07:58:44.122073985 +0000 UTC m=+4338.994700706"
Feb 20 07:58:46 crc kubenswrapper[5094]: I0220 07:58:46.840860 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 07:58:46 crc kubenswrapper[5094]: E0220 07:58:46.842068 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:58:49 crc kubenswrapper[5094]: I0220 07:58:49.551077 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:49 crc kubenswrapper[5094]: I0220 07:58:49.551195 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:50 crc kubenswrapper[5094]: I0220 07:58:50.711693 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2pznd" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server" probeResult="failure" output=<
Feb 20 07:58:50 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 07:58:50 crc kubenswrapper[5094]: >
Feb 20 07:58:57 crc kubenswrapper[5094]: I0220 07:58:57.840966 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 07:58:57 crc kubenswrapper[5094]: E0220 07:58:57.841979 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:58:59 crc kubenswrapper[5094]: I0220 07:58:59.617714 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:59 crc kubenswrapper[5094]: I0220 07:58:59.686619 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:58:59 crc kubenswrapper[5094]: I0220 07:58:59.866491 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"]
Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.235197 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2pznd" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server" containerID="cri-o://aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578" gracePeriod=2
Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.715380 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.897495 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") "
Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.897587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") "
Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.897852 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") "
Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.900087 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities" (OuterVolumeSpecName: "utilities") pod "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" (UID: "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.909970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp" (OuterVolumeSpecName: "kube-api-access-4hxrp") pod "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" (UID: "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e"). InnerVolumeSpecName "kube-api-access-4hxrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.001501 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") on node \"crc\" DevicePath \"\""
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.001559 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.086624 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" (UID: "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.103806 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247541 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578" exitCode=0
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247619 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"}
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247667 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"16f2d279e9786bec6efd08bd1ca91aa319b918a099a927f356625ed09bf4f4dd"}
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247700 5094 scope.go:117] "RemoveContainer" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247968 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.286024 5094 scope.go:117] "RemoveContainer" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.330209 5094 scope.go:117] "RemoveContainer" containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.376308 5094 scope.go:117] "RemoveContainer" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"
Feb 20 07:59:02 crc kubenswrapper[5094]: E0220 07:59:02.377038 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578\": container with ID starting with aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578 not found: ID does not exist" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377096 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"} err="failed to get container status \"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578\": rpc error: code = NotFound desc = could not find container \"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578\": container with ID starting with aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578 not found: ID does not exist"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377133 5094 scope.go:117] "RemoveContainer" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"
Feb 20 07:59:02 crc kubenswrapper[5094]: E0220 07:59:02.377621 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71\": container with ID starting with 5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71 not found: ID does not exist" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377734 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"} err="failed to get container status \"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71\": rpc error: code = NotFound desc = could not find container \"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71\": container with ID starting with 5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71 not found: ID does not exist"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377793 5094 scope.go:117] "RemoveContainer" containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5"
Feb 20 07:59:02 crc kubenswrapper[5094]: E0220 07:59:02.378152 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5\": container with ID starting with 912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5 not found: ID does not exist" containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.378185 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5"} err="failed to get container status \"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5\": rpc error: code = NotFound desc = could not find container \"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5\": container with ID starting with 912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5 not found: ID does not exist"
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.382223 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"]
Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.391951 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"]
Feb 20 07:59:03 crc kubenswrapper[5094]: I0220 07:59:03.853825 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" path="/var/lib/kubelet/pods/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e/volumes"
Feb 20 07:59:08 crc kubenswrapper[5094]: I0220 07:59:08.840690 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 07:59:08 crc kubenswrapper[5094]: E0220 07:59:08.841467 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:59:23 crc kubenswrapper[5094]: I0220 07:59:23.840777 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 07:59:23 crc kubenswrapper[5094]: E0220 07:59:23.841991 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:59:36 crc kubenswrapper[5094]: I0220 07:59:36.840978 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 07:59:36 crc kubenswrapper[5094]: E0220 07:59:36.842476 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:59:48 crc kubenswrapper[5094]: I0220 07:59:48.841157 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 07:59:48 crc kubenswrapper[5094]: E0220 07:59:48.842760 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.221175 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"]
Feb 20 08:00:00 crc kubenswrapper[5094]: E0220 08:00:00.222379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-utilities"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.222441 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-utilities"
Feb 20 08:00:00 crc kubenswrapper[5094]: E0220 08:00:00.222493 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.222503 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server"
Feb 20 08:00:00 crc kubenswrapper[5094]: E0220 08:00:00.222529 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-content"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.222537 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-content"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.223042 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.223858 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.229629 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.239375 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"]
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.243694 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.349753 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.350410 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.350541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.452411 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.452487 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.452604 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.453947 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.467219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.472747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.547393 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.074483 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"]
Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.868253 5094 generic.go:334] "Generic (PLEG): container finished" podID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerID="a7f01ab3dfebce16c461640e15ab5cb83ed76e8a8bf4b49d9de590c4cb6aacd4" exitCode=0
Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.868357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" event={"ID":"a036c1c3-0425-4a2e-a42d-2abfcdc49620","Type":"ContainerDied","Data":"a7f01ab3dfebce16c461640e15ab5cb83ed76e8a8bf4b49d9de590c4cb6aacd4"}
Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.868942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" event={"ID":"a036c1c3-0425-4a2e-a42d-2abfcdc49620","Type":"ContainerStarted","Data":"92793db5d7f411266cc1b360d23e2052740b7de803bbc4e041ccb1c325e0851a"}
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.318167 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.407191 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") "
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.407405 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") "
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.407474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") "
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.409233 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume" (OuterVolumeSpecName: "config-volume") pod "a036c1c3-0425-4a2e-a42d-2abfcdc49620" (UID: "a036c1c3-0425-4a2e-a42d-2abfcdc49620"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.417056 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6" (OuterVolumeSpecName: "kube-api-access-gdrm6") pod "a036c1c3-0425-4a2e-a42d-2abfcdc49620" (UID: "a036c1c3-0425-4a2e-a42d-2abfcdc49620"). InnerVolumeSpecName "kube-api-access-gdrm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.417927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a036c1c3-0425-4a2e-a42d-2abfcdc49620" (UID: "a036c1c3-0425-4a2e-a42d-2abfcdc49620"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.510386 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.510463 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") on node \"crc\" DevicePath \"\""
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.510490 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.840584 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:00:03 crc kubenswrapper[5094]: E0220 08:00:03.841795 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.889086 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" event={"ID":"a036c1c3-0425-4a2e-a42d-2abfcdc49620","Type":"ContainerDied","Data":"92793db5d7f411266cc1b360d23e2052740b7de803bbc4e041ccb1c325e0851a"}
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.889195 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92793db5d7f411266cc1b360d23e2052740b7de803bbc4e041ccb1c325e0851a"
Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.889204 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"
Feb 20 08:00:04 crc kubenswrapper[5094]: I0220 08:00:04.431407 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"]
Feb 20 08:00:04 crc kubenswrapper[5094]: I0220 08:00:04.438887 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"]
Feb 20 08:00:05 crc kubenswrapper[5094]: I0220 08:00:05.864822 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74085586-b345-46e6-9367-d3b5243312a4" path="/var/lib/kubelet/pods/74085586-b345-46e6-9367-d3b5243312a4/volumes"
Feb 20 08:00:14 crc kubenswrapper[5094]: I0220 08:00:14.841933 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:00:14 crc kubenswrapper[5094]: E0220 08:00:14.843269 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:00:28 crc kubenswrapper[5094]: I0220 08:00:28.840956 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:00:28 crc kubenswrapper[5094]: E0220 08:00:28.842245 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:00:35 crc kubenswrapper[5094]: I0220 08:00:35.712025 5094 scope.go:117] "RemoveContainer" containerID="c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686"
Feb 20 08:00:43 crc kubenswrapper[5094]: I0220 08:00:43.841329 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:00:43 crc kubenswrapper[5094]: E0220 08:00:43.842753 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:00:55 crc kubenswrapper[5094]: I0220 08:00:55.855438 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:00:55 crc kubenswrapper[5094]: E0220 08:00:55.856944 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:01:07 crc kubenswrapper[5094]: I0220 08:01:07.848793 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:01:07 crc kubenswrapper[5094]: E0220 08:01:07.849856 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:01:22 crc kubenswrapper[5094]: I0220 08:01:22.840837 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:01:22 crc kubenswrapper[5094]: E0220 08:01:22.842371 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:01:36 crc kubenswrapper[5094]: I0220 08:01:36.844091 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:01:36 crc kubenswrapper[5094]: E0220 08:01:36.845553 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:01:51 crc kubenswrapper[5094]: I0220 08:01:51.842378 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:01:51 crc kubenswrapper[5094]: E0220 08:01:51.843624 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:02:04 crc kubenswrapper[5094]: I0220 08:02:04.840430 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:02:06 crc kubenswrapper[5094]: I0220 08:02:06.175013 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"}
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.017586 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"]
Feb 20 08:02:07 crc kubenswrapper[5094]: E0220 08:02:07.019263 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerName="collect-profiles"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.019297 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerName="collect-profiles"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.021199 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerName="collect-profiles"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.025982 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.039187 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"]
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.141386 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.141547 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.141775 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.243269 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm"
Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.243380 5094
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.243415 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.244012 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.244251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.267133 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.385309 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.849043 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:08 crc kubenswrapper[5094]: I0220 08:02:08.194278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerStarted","Data":"6e07502c565034b70d7e1855695af0e1c2b2ffe84d0bc5fe019ad94a17c59ced"} Feb 20 08:02:09 crc kubenswrapper[5094]: I0220 08:02:09.209365 5094 generic.go:334] "Generic (PLEG): container finished" podID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" exitCode=0 Feb 20 08:02:09 crc kubenswrapper[5094]: I0220 08:02:09.209443 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e"} Feb 20 08:02:10 crc kubenswrapper[5094]: I0220 08:02:10.225770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerStarted","Data":"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c"} Feb 20 08:02:11 crc kubenswrapper[5094]: I0220 08:02:11.242693 5094 generic.go:334] "Generic (PLEG): container finished" podID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" exitCode=0 Feb 20 08:02:11 crc kubenswrapper[5094]: I0220 08:02:11.242970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" 
event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c"} Feb 20 08:02:12 crc kubenswrapper[5094]: I0220 08:02:12.256579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerStarted","Data":"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0"} Feb 20 08:02:12 crc kubenswrapper[5094]: I0220 08:02:12.298802 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jcqhm" podStartSLOduration=3.8258306490000002 podStartE2EDuration="6.298773814s" podCreationTimestamp="2026-02-20 08:02:06 +0000 UTC" firstStartedPulling="2026-02-20 08:02:09.213078202 +0000 UTC m=+4544.085704913" lastFinishedPulling="2026-02-20 08:02:11.686021327 +0000 UTC m=+4546.558648078" observedRunningTime="2026-02-20 08:02:12.292003552 +0000 UTC m=+4547.164630263" watchObservedRunningTime="2026-02-20 08:02:12.298773814 +0000 UTC m=+4547.171400565" Feb 20 08:02:17 crc kubenswrapper[5094]: I0220 08:02:17.386175 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:17 crc kubenswrapper[5094]: I0220 08:02:17.387452 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:17 crc kubenswrapper[5094]: I0220 08:02:17.447385 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:18 crc kubenswrapper[5094]: I0220 08:02:18.374632 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:18 crc kubenswrapper[5094]: I0220 08:02:18.446451 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.327662 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jcqhm" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" containerID="cri-o://783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" gracePeriod=2 Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.847603 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.930225 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.930496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.930674 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.932346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities" (OuterVolumeSpecName: "utilities") pod "f83e717d-9073-4ecb-8aa5-f35d5fd35a84" (UID: 
"f83e717d-9073-4ecb-8aa5-f35d5fd35a84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.945976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d" (OuterVolumeSpecName: "kube-api-access-m4w9d") pod "f83e717d-9073-4ecb-8aa5-f35d5fd35a84" (UID: "f83e717d-9073-4ecb-8aa5-f35d5fd35a84"). InnerVolumeSpecName "kube-api-access-m4w9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.032905 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") on node \"crc\" DevicePath \"\"" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.032992 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.066396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f83e717d-9073-4ecb-8aa5-f35d5fd35a84" (UID: "f83e717d-9073-4ecb-8aa5-f35d5fd35a84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.134893 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.340575 5094 generic.go:334] "Generic (PLEG): container finished" podID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" exitCode=0 Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.340729 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.340738 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0"} Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.341386 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"6e07502c565034b70d7e1855695af0e1c2b2ffe84d0bc5fe019ad94a17c59ced"} Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.341460 5094 scope.go:117] "RemoveContainer" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.374840 5094 scope.go:117] "RemoveContainer" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.410741 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:21 crc kubenswrapper[5094]: 
I0220 08:02:21.444355 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.453775 5094 scope.go:117] "RemoveContainer" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.485027 5094 scope.go:117] "RemoveContainer" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" Feb 20 08:02:21 crc kubenswrapper[5094]: E0220 08:02:21.485663 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0\": container with ID starting with 783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0 not found: ID does not exist" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.485747 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0"} err="failed to get container status \"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0\": rpc error: code = NotFound desc = could not find container \"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0\": container with ID starting with 783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0 not found: ID does not exist" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.485792 5094 scope.go:117] "RemoveContainer" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" Feb 20 08:02:21 crc kubenswrapper[5094]: E0220 08:02:21.486651 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c\": container 
with ID starting with 0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c not found: ID does not exist" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.486688 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c"} err="failed to get container status \"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c\": rpc error: code = NotFound desc = could not find container \"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c\": container with ID starting with 0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c not found: ID does not exist" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.486741 5094 scope.go:117] "RemoveContainer" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" Feb 20 08:02:21 crc kubenswrapper[5094]: E0220 08:02:21.487140 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e\": container with ID starting with 16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e not found: ID does not exist" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.487190 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e"} err="failed to get container status \"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e\": rpc error: code = NotFound desc = could not find container \"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e\": container with ID starting with 16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e not 
found: ID does not exist" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.856356 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" path="/var/lib/kubelet/pods/f83e717d-9073-4ecb-8aa5-f35d5fd35a84/volumes" Feb 20 08:04:34 crc kubenswrapper[5094]: I0220 08:04:34.106865 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:04:34 crc kubenswrapper[5094]: I0220 08:04:34.107927 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:05:04 crc kubenswrapper[5094]: I0220 08:05:04.107598 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:05:04 crc kubenswrapper[5094]: I0220 08:05:04.108675 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.605179 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:18 crc kubenswrapper[5094]: E0220 
08:05:18.606679 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.606735 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" Feb 20 08:05:18 crc kubenswrapper[5094]: E0220 08:05:18.606772 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-utilities" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.606787 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-utilities" Feb 20 08:05:18 crc kubenswrapper[5094]: E0220 08:05:18.606834 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-content" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.606848 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-content" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.607107 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.609052 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.639883 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.758767 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.758861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.758908 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.860961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861220 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861282 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.894089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.932016 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.237634 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.595292 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.597087 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.619303 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.676868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.676957 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.676998 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " 
pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.778968 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779903 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.803222 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.927578 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.078948 5094 generic.go:334] "Generic (PLEG): container finished" podID="bb372958-7c69-465a-b777-030494eb246a" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d" exitCode=0
Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.079074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d"}
Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.079466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerStarted","Data":"8430f9b319dadd2c5b9591487ab581b43a2994a86d12a939297ba853574c3fec"}
Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.082107 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.170910 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"]
Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.087481 5094 generic.go:334] "Generic (PLEG): container finished" podID="bb372958-7c69-465a-b777-030494eb246a" containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82" exitCode=0
Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.087580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82"}
Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.089733 5094 generic.go:334] "Generic (PLEG): container finished" podID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87" exitCode=0
Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.089792 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87"}
Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.089830 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerStarted","Data":"26632a487f68eadc734723e4639ec5fdf159933d393e3a4066d8d387504c2ce7"}
Feb 20 08:05:22 crc kubenswrapper[5094]: I0220 08:05:22.101810 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerStarted","Data":"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"}
Feb 20 08:05:22 crc kubenswrapper[5094]: I0220 08:05:22.122517 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9vn8k" podStartSLOduration=2.69536786 podStartE2EDuration="4.122490642s" podCreationTimestamp="2026-02-20 08:05:18 +0000 UTC" firstStartedPulling="2026-02-20 08:05:20.081519721 +0000 UTC m=+4734.954146442" lastFinishedPulling="2026-02-20 08:05:21.508642503 +0000 UTC m=+4736.381269224" observedRunningTime="2026-02-20 08:05:22.12029461 +0000 UTC m=+4736.992921331" watchObservedRunningTime="2026-02-20 08:05:22.122490642 +0000 UTC m=+4736.995117353"
Feb 20 08:05:23 crc kubenswrapper[5094]: I0220 08:05:23.112290 5094 generic.go:334] "Generic (PLEG): container finished" podID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4" exitCode=0
Feb 20 08:05:23 crc kubenswrapper[5094]: I0220 08:05:23.112362 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4"}
Feb 20 08:05:24 crc kubenswrapper[5094]: I0220 08:05:24.125568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerStarted","Data":"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"}
Feb 20 08:05:24 crc kubenswrapper[5094]: I0220 08:05:24.155643 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kcglh" podStartSLOduration=2.779211085 podStartE2EDuration="5.155618426s" podCreationTimestamp="2026-02-20 08:05:19 +0000 UTC" firstStartedPulling="2026-02-20 08:05:21.091479504 +0000 UTC m=+4735.964106215" lastFinishedPulling="2026-02-20 08:05:23.467886845 +0000 UTC m=+4738.340513556" observedRunningTime="2026-02-20 08:05:24.151553459 +0000 UTC m=+4739.024180170" watchObservedRunningTime="2026-02-20 08:05:24.155618426 +0000 UTC m=+4739.028245137"
Feb 20 08:05:28 crc kubenswrapper[5094]: I0220 08:05:28.932340 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9vn8k"
Feb 20 08:05:28 crc kubenswrapper[5094]: I0220 08:05:28.933846 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9vn8k"
Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.101305 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9vn8k"
Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.242216 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9vn8k"
Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.345618 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"]
Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.928458 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.928522 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.969215 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:30 crc kubenswrapper[5094]: I0220 08:05:30.241185 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.197052 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9vn8k" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server" containerID="cri-o://c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc" gracePeriod=2
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.637031 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k"
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.727587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"bb372958-7c69-465a-b777-030494eb246a\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") "
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.727659 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"bb372958-7c69-465a-b777-030494eb246a\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") "
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.727727 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"bb372958-7c69-465a-b777-030494eb246a\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") "
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.728792 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities" (OuterVolumeSpecName: "utilities") pod "bb372958-7c69-465a-b777-030494eb246a" (UID: "bb372958-7c69-465a-b777-030494eb246a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.734733 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z" (OuterVolumeSpecName: "kube-api-access-xkl9z") pod "bb372958-7c69-465a-b777-030494eb246a" (UID: "bb372958-7c69-465a-b777-030494eb246a"). InnerVolumeSpecName "kube-api-access-xkl9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.743568 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"]
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.786139 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb372958-7c69-465a-b777-030494eb246a" (UID: "bb372958-7c69-465a-b777-030494eb246a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.829460 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") on node \"crc\" DevicePath \"\""
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.829491 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.829501 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.212750 5094 generic.go:334] "Generic (PLEG): container finished" podID="bb372958-7c69-465a-b777-030494eb246a" containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc" exitCode=0
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.213353 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kcglh" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="registry-server" containerID="cri-o://3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89" gracePeriod=2
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.214639 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.217042 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"}
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.217218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"8430f9b319dadd2c5b9591487ab581b43a2994a86d12a939297ba853574c3fec"}
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.217348 5094 scope.go:117] "RemoveContainer" containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.255556 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"]
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.263624 5094 scope.go:117] "RemoveContainer" containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.265337 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"]
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.306255 5094 scope.go:117] "RemoveContainer" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.435831 5094 scope.go:117] "RemoveContainer" containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"
Feb 20 08:05:32 crc kubenswrapper[5094]: E0220 08:05:32.436316 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc\": container with ID starting with c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc not found: ID does not exist" containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436352 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"} err="failed to get container status \"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc\": rpc error: code = NotFound desc = could not find container \"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc\": container with ID starting with c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc not found: ID does not exist"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436376 5094 scope.go:117] "RemoveContainer" containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82"
Feb 20 08:05:32 crc kubenswrapper[5094]: E0220 08:05:32.436762 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82\": container with ID starting with d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82 not found: ID does not exist" containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436837 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82"} err="failed to get container status \"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82\": rpc error: code = NotFound desc = could not find container \"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82\": container with ID starting with d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82 not found: ID does not exist"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436879 5094 scope.go:117] "RemoveContainer" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d"
Feb 20 08:05:32 crc kubenswrapper[5094]: E0220 08:05:32.437214 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d\": container with ID starting with 6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d not found: ID does not exist" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.437238 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d"} err="failed to get container status \"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d\": rpc error: code = NotFound desc = could not find container \"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d\": container with ID starting with 6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d not found: ID does not exist"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.692883 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.846538 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") "
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.846764 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") "
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.846861 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") "
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.847940 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities" (OuterVolumeSpecName: "utilities") pod "48cd9c77-9518-4f8d-aae6-01f8cc109bbd" (UID: "48cd9c77-9518-4f8d-aae6-01f8cc109bbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.852309 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww" (OuterVolumeSpecName: "kube-api-access-gtpww") pod "48cd9c77-9518-4f8d-aae6-01f8cc109bbd" (UID: "48cd9c77-9518-4f8d-aae6-01f8cc109bbd"). InnerVolumeSpecName "kube-api-access-gtpww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.897493 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48cd9c77-9518-4f8d-aae6-01f8cc109bbd" (UID: "48cd9c77-9518-4f8d-aae6-01f8cc109bbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.949318 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.949365 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.949377 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") on node \"crc\" DevicePath \"\""
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.224970 5094 generic.go:334] "Generic (PLEG): container finished" podID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89" exitCode=0
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"}
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225137 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225475 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"26632a487f68eadc734723e4639ec5fdf159933d393e3a4066d8d387504c2ce7"}
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225510 5094 scope.go:117] "RemoveContainer" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.254764 5094 scope.go:117] "RemoveContainer" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.281568 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"]
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.288852 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"]
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.298388 5094 scope.go:117] "RemoveContainer" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.317128 5094 scope.go:117] "RemoveContainer" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"
Feb 20 08:05:33 crc kubenswrapper[5094]: E0220 08:05:33.317688 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89\": container with ID starting with 3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89 not found: ID does not exist" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.317907 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"} err="failed to get container status \"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89\": rpc error: code = NotFound desc = could not find container \"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89\": container with ID starting with 3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89 not found: ID does not exist"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.317935 5094 scope.go:117] "RemoveContainer" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4"
Feb 20 08:05:33 crc kubenswrapper[5094]: E0220 08:05:33.318255 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4\": container with ID starting with ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4 not found: ID does not exist" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.318292 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4"} err="failed to get container status \"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4\": rpc error: code = NotFound desc = could not find container \"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4\": container with ID starting with ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4 not found: ID does not exist"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.318313 5094 scope.go:117] "RemoveContainer" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87"
Feb 20 08:05:33 crc kubenswrapper[5094]: E0220 08:05:33.318661 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87\": container with ID starting with 39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87 not found: ID does not exist" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.318727 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87"} err="failed to get container status \"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87\": rpc error: code = NotFound desc = could not find container \"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87\": container with ID starting with 39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87 not found: ID does not exist"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.850765 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" path="/var/lib/kubelet/pods/48cd9c77-9518-4f8d-aae6-01f8cc109bbd/volumes"
Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.852468 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb372958-7c69-465a-b777-030494eb246a" path="/var/lib/kubelet/pods/bb372958-7c69-465a-b777-030494eb246a/volumes"
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.107389 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.107500 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.107580 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.108646 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.108776 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759" gracePeriod=600
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.241001 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759" exitCode=0
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.241114 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"}
Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.241176 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"
Feb 20 08:05:35 crc kubenswrapper[5094]: I0220 08:05:35.255650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"}
Feb 20 08:07:34 crc kubenswrapper[5094]: I0220 08:07:34.107145 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:07:34 crc kubenswrapper[5094]: I0220 08:07:34.107886 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:08:04 crc kubenswrapper[5094]: I0220 08:08:04.107499 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:08:04 crc kubenswrapper[5094]: I0220 08:08:04.108333 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.107321 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.108019 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.108084 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.108910 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.109009 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" gracePeriod=600
Feb 20 08:08:34 crc kubenswrapper[5094]: E0220 08:08:34.246669 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.964624 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" exitCode=0
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.964680 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"}
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.964748 5094 scope.go:117] "RemoveContainer" containerID="3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.965327 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:08:34 crc kubenswrapper[5094]: E0220 08:08:34.965695 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658071 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658841 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658860 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658878 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658887 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658905 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658914 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658937 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658944 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658959 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658967 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658979 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658987 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.659177 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.659196 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.660464 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.664641 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.729276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.729344 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.729405 5094 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.830494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.830581 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.830648 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.831128 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.831201 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.851649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.995430 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:41 crc kubenswrapper[5094]: I0220 08:08:41.403127 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"] Feb 20 08:08:42 crc kubenswrapper[5094]: I0220 08:08:42.017146 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145" exitCode=0 Feb 20 08:08:42 crc kubenswrapper[5094]: I0220 08:08:42.017241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145"} Feb 20 08:08:42 crc kubenswrapper[5094]: I0220 08:08:42.017462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerStarted","Data":"ba3b2230d07fa163b3aed23f4cdc1e6ce956e10bfc139d6a0576b073cc675fdf"} Feb 20 08:08:43 crc kubenswrapper[5094]: I0220 08:08:43.026314 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerStarted","Data":"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"} Feb 20 08:08:44 crc kubenswrapper[5094]: I0220 08:08:44.037764 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c" exitCode=0 Feb 20 08:08:44 crc kubenswrapper[5094]: I0220 08:08:44.037869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"} Feb 20 08:08:45 crc kubenswrapper[5094]: I0220 08:08:45.049632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerStarted","Data":"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"} Feb 20 08:08:47 crc kubenswrapper[5094]: I0220 08:08:47.841087 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:08:47 crc kubenswrapper[5094]: E0220 08:08:47.841641 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:08:50 crc kubenswrapper[5094]: I0220 08:08:50.996204 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:50 crc 
kubenswrapper[5094]: I0220 08:08:50.997826 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:08:52 crc kubenswrapper[5094]: I0220 08:08:52.041085 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4wfxd" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server" probeResult="failure" output=< Feb 20 08:08:52 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:08:52 crc kubenswrapper[5094]: > Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.068024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.101259 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4wfxd" podStartSLOduration=18.420452936 podStartE2EDuration="21.101217469s" podCreationTimestamp="2026-02-20 08:08:40 +0000 UTC" firstStartedPulling="2026-02-20 08:08:42.018824699 +0000 UTC m=+4936.891451410" lastFinishedPulling="2026-02-20 08:08:44.699589202 +0000 UTC m=+4939.572215943" observedRunningTime="2026-02-20 08:08:45.071679719 +0000 UTC m=+4939.944306450" watchObservedRunningTime="2026-02-20 08:09:01.101217469 +0000 UTC m=+4955.973844180" Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.127481 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.303737 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"] Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.168124 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4wfxd" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" 
containerName="registry-server" containerID="cri-o://cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2" gracePeriod=2 Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.532881 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.653199 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.653281 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.653347 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.654435 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities" (OuterVolumeSpecName: "utilities") pod "d7113bba-5e76-4cec-86ae-8b8f25962f3b" (UID: "d7113bba-5e76-4cec-86ae-8b8f25962f3b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.659291 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x" (OuterVolumeSpecName: "kube-api-access-dww2x") pod "d7113bba-5e76-4cec-86ae-8b8f25962f3b" (UID: "d7113bba-5e76-4cec-86ae-8b8f25962f3b"). InnerVolumeSpecName "kube-api-access-dww2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.756974 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") on node \"crc\" DevicePath \"\"" Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.757018 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.798316 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7113bba-5e76-4cec-86ae-8b8f25962f3b" (UID: "d7113bba-5e76-4cec-86ae-8b8f25962f3b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.840497 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:09:02 crc kubenswrapper[5094]: E0220 08:09:02.840914 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.858392 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188371 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2" exitCode=0 Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"} Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188483 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"ba3b2230d07fa163b3aed23f4cdc1e6ce956e10bfc139d6a0576b073cc675fdf"} Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188516 5094 scope.go:117] "RemoveContainer" 
containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188521 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.217058 5094 scope.go:117] "RemoveContainer" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.235240 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"] Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.241433 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"] Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.361617 5094 scope.go:117] "RemoveContainer" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.385389 5094 scope.go:117] "RemoveContainer" containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2" Feb 20 08:09:03 crc kubenswrapper[5094]: E0220 08:09:03.385926 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2\": container with ID starting with cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2 not found: ID does not exist" containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.385961 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"} err="failed to get container status \"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2\": rpc error: code = NotFound desc = could not 
find container \"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2\": container with ID starting with cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2 not found: ID does not exist" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.385983 5094 scope.go:117] "RemoveContainer" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c" Feb 20 08:09:03 crc kubenswrapper[5094]: E0220 08:09:03.386377 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c\": container with ID starting with 2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c not found: ID does not exist" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.386428 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"} err="failed to get container status \"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c\": rpc error: code = NotFound desc = could not find container \"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c\": container with ID starting with 2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c not found: ID does not exist" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.386462 5094 scope.go:117] "RemoveContainer" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145" Feb 20 08:09:03 crc kubenswrapper[5094]: E0220 08:09:03.386809 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145\": container with ID starting with ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145 not found: ID 
does not exist" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.386843 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145"} err="failed to get container status \"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145\": rpc error: code = NotFound desc = could not find container \"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145\": container with ID starting with ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145 not found: ID does not exist" Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.848660 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" path="/var/lib/kubelet/pods/d7113bba-5e76-4cec-86ae-8b8f25962f3b/volumes" Feb 20 08:09:13 crc kubenswrapper[5094]: I0220 08:09:13.840312 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:09:13 crc kubenswrapper[5094]: E0220 08:09:13.841207 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:09:27 crc kubenswrapper[5094]: I0220 08:09:27.841373 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:09:27 crc kubenswrapper[5094]: E0220 08:09:27.842356 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:09:41 crc kubenswrapper[5094]: I0220 08:09:41.841654 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:09:41 crc kubenswrapper[5094]: E0220 08:09:41.842982 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:09:52 crc kubenswrapper[5094]: I0220 08:09:52.840479 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:09:52 crc kubenswrapper[5094]: E0220 08:09:52.842261 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:10:07 crc kubenswrapper[5094]: I0220 08:10:07.840313 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:10:07 crc kubenswrapper[5094]: E0220 08:10:07.842519 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:10:22 crc kubenswrapper[5094]: I0220 08:10:22.842316 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:10:22 crc kubenswrapper[5094]: E0220 08:10:22.843666 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:10:37 crc kubenswrapper[5094]: I0220 08:10:37.840796 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:10:37 crc kubenswrapper[5094]: E0220 08:10:37.841877 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:10:49 crc kubenswrapper[5094]: I0220 08:10:49.840389 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:10:49 crc kubenswrapper[5094]: E0220 08:10:49.841870 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:11:01 crc kubenswrapper[5094]: I0220 08:11:01.840856 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:11:01 crc kubenswrapper[5094]: E0220 08:11:01.841676 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:11:14 crc kubenswrapper[5094]: I0220 08:11:14.840873 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:11:14 crc kubenswrapper[5094]: E0220 08:11:14.842003 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:11:25 crc kubenswrapper[5094]: I0220 08:11:25.844419 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:11:25 crc kubenswrapper[5094]: E0220 08:11:25.845846 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:11:38 crc kubenswrapper[5094]: I0220 08:11:38.841629 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:11:38 crc kubenswrapper[5094]: E0220 08:11:38.842686 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:11:52 crc kubenswrapper[5094]: I0220 08:11:52.840696 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:11:52 crc kubenswrapper[5094]: E0220 08:11:52.841308 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:06 crc kubenswrapper[5094]: I0220 08:12:06.840666 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:06 crc kubenswrapper[5094]: E0220 08:12:06.842035 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:17 crc kubenswrapper[5094]: I0220 08:12:17.840657 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:17 crc kubenswrapper[5094]: E0220 08:12:17.841652 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:29 crc kubenswrapper[5094]: I0220 08:12:29.840079 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:29 crc kubenswrapper[5094]: E0220 08:12:29.841692 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:41 crc kubenswrapper[5094]: I0220 08:12:41.839797 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:41 crc kubenswrapper[5094]: E0220 08:12:41.840374 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:54 crc kubenswrapper[5094]: I0220 08:12:54.840628 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:54 crc kubenswrapper[5094]: E0220 08:12:54.841365 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:05 crc kubenswrapper[5094]: I0220 08:13:05.848157 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:05 crc kubenswrapper[5094]: E0220 08:13:05.849066 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:16 crc kubenswrapper[5094]: I0220 08:13:16.841038 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:16 crc kubenswrapper[5094]: E0220 08:13:16.841855 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:27 crc kubenswrapper[5094]: I0220 08:13:27.841090 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:27 crc kubenswrapper[5094]: E0220 08:13:27.842443 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:41 crc kubenswrapper[5094]: I0220 08:13:41.840911 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:42 crc kubenswrapper[5094]: I0220 08:13:42.416503 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca"}
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.158222 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"]
Feb 20 08:15:00 crc kubenswrapper[5094]: E0220 08:15:00.159541 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.159569 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server"
Feb 20 08:15:00 crc kubenswrapper[5094]: E0220 08:15:00.159597 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-content"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.159611 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-content"
Feb 20 08:15:00 crc kubenswrapper[5094]: E0220 08:15:00.159628 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-utilities"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.159645 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-utilities"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.161479 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.163174 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.167546 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.168006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.195118 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"]
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.338971 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.339023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.339130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.440041 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.440103 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.440185 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.441221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.455072 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.459325 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.497204 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.889715 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"]
Feb 20 08:15:01 crc kubenswrapper[5094]: I0220 08:15:01.047739 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerStarted","Data":"df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9"}
Feb 20 08:15:01 crc kubenswrapper[5094]: I0220 08:15:01.047800 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerStarted","Data":"e6dce8367cd274fa2f1bf6b1f243c57fa20df7ec4137f484c2b7028850b4e915"}
Feb 20 08:15:01 crc kubenswrapper[5094]: I0220 08:15:01.066035 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" podStartSLOduration=1.065964497 podStartE2EDuration="1.065964497s" podCreationTimestamp="2026-02-20 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:15:01.062649327 +0000 UTC m=+5315.935276058" watchObservedRunningTime="2026-02-20 08:15:01.065964497 +0000 UTC m=+5315.938591218"
Feb 20 08:15:02 crc kubenswrapper[5094]: I0220 08:15:02.055959 5094 generic.go:334] "Generic (PLEG): container finished" podID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerID="df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9" exitCode=0
Feb 20 08:15:02 crc kubenswrapper[5094]: I0220 08:15:02.056073 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerDied","Data":"df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9"}
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.358779 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.389694 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") "
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.389807 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") "
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.389872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") "
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.394283 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b1b88d4-fc9b-465d-907e-7abf6c46c919" (UID: "0b1b88d4-fc9b-465d-907e-7abf6c46c919"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.399005 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b1b88d4-fc9b-465d-907e-7abf6c46c919" (UID: "0b1b88d4-fc9b-465d-907e-7abf6c46c919"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.399080 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx" (OuterVolumeSpecName: "kube-api-access-hj6hx") pod "0b1b88d4-fc9b-465d-907e-7abf6c46c919" (UID: "0b1b88d4-fc9b-465d-907e-7abf6c46c919"). InnerVolumeSpecName "kube-api-access-hj6hx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.491496 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.491533 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.491543 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") on node \"crc\" DevicePath \"\""
Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.073345 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerDied","Data":"e6dce8367cd274fa2f1bf6b1f243c57fa20df7ec4137f484c2b7028850b4e915"}
Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.073393 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6dce8367cd274fa2f1bf6b1f243c57fa20df7ec4137f484c2b7028850b4e915"
Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.073409 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"
Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.437276 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"]
Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.441799 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"]
Feb 20 08:15:05 crc kubenswrapper[5094]: I0220 08:15:05.853933 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" path="/var/lib/kubelet/pods/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d/volumes"
Feb 20 08:15:36 crc kubenswrapper[5094]: I0220 08:15:36.049262 5094 scope.go:117] "RemoveContainer" containerID="07ea8e807e5436859467c750ef51269844eba966788ab09e687b71868fdd8b31"
Feb 20 08:16:04 crc kubenswrapper[5094]: I0220 08:16:04.107405 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:16:04 crc kubenswrapper[5094]: I0220 08:16:04.108088 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.067457 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szndh"]
Feb 20 08:16:12 crc kubenswrapper[5094]: E0220 08:16:12.068413 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerName="collect-profiles"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.068432 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerName="collect-profiles"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.068640 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerName="collect-profiles"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.069881 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.085092 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szndh"]
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.265102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.265493 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.265638 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.366774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.366842 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.366865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.367282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.367384 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.390633 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.688073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.143474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szndh"]
Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.604434 5094 generic.go:334] "Generic (PLEG): container finished" podID="49693624-23e2-4579-a736-a6148ac00de5" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89" exitCode=0
Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.604664 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89"}
Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.604690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerStarted","Data":"77800da7e8ad36fe3403f66f675a7ce8b8ae989d7a8a45320926ea3f87f9b807"}
Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.607427 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 08:16:14 crc kubenswrapper[5094]: I0220 08:16:14.620606 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerStarted","Data":"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"}
Feb 20 08:16:15 crc kubenswrapper[5094]: I0220 08:16:15.633376 5094 generic.go:334] "Generic (PLEG): container finished" podID="49693624-23e2-4579-a736-a6148ac00de5" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111" exitCode=0
Feb 20 08:16:15 crc kubenswrapper[5094]: I0220 08:16:15.633446 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"}
Feb 20 08:16:16 crc kubenswrapper[5094]: I0220 08:16:16.642524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerStarted","Data":"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"}
Feb 20 08:16:16 crc kubenswrapper[5094]: I0220 08:16:16.669173 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szndh" podStartSLOduration=2.217313269 podStartE2EDuration="4.66915261s" podCreationTimestamp="2026-02-20 08:16:12 +0000 UTC" firstStartedPulling="2026-02-20 08:16:13.607243041 +0000 UTC m=+5388.479869752" lastFinishedPulling="2026-02-20 08:16:16.059082382 +0000 UTC m=+5390.931709093" observedRunningTime="2026-02-20 08:16:16.662174583 +0000 UTC m=+5391.534801294" watchObservedRunningTime="2026-02-20 08:16:16.66915261 +0000 UTC m=+5391.541779321"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.490488 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"]
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.492778 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.508806 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"]
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.686021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.686507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.686645 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788086 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788190 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788309 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788679 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.824174 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.123648 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.397193 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"]
Feb 20 08:16:20 crc kubenswrapper[5094]: W0220 08:16:20.399360 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9a2484_7e04_40b5_aae9_895f1a450ad6.slice/crio-c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128 WatchSource:0}: Error finding container c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128: Status 404 returned error can't find the container with id c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.669399 5094 generic.go:334] "Generic (PLEG): container finished" podID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5" exitCode=0
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.669504 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5"}
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.669778 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerStarted","Data":"c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128"}
Feb 20 08:16:21 crc kubenswrapper[5094]: I0220 08:16:21.680046 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerStarted","Data":"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"}
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.688163 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.688422 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.690972 5094 generic.go:334] "Generic (PLEG): container finished" podID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41" exitCode=0
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.691024 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"}
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.772843 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:23 crc kubenswrapper[5094]: I0220 08:16:23.702653 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerStarted","Data":"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"}
Feb 20 08:16:23 crc kubenswrapper[5094]: I0220 08:16:23.728191 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2792j" podStartSLOduration=2.3216876859999998 podStartE2EDuration="4.728166227s" podCreationTimestamp="2026-02-20 08:16:19 +0000 UTC" firstStartedPulling="2026-02-20 08:16:20.672063378 +0000 UTC m=+5395.544690089" lastFinishedPulling="2026-02-20 08:16:23.078541909 +0000 UTC m=+5397.951168630" observedRunningTime="2026-02-20 08:16:23.722936122 +0000 UTC m=+5398.595562873" watchObservedRunningTime="2026-02-20 08:16:23.728166227 +0000 UTC m=+5398.600792948"
Feb 20 08:16:23 crc kubenswrapper[5094]: I0220 08:16:23.766555 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:25 crc kubenswrapper[5094]: I0220 08:16:25.039943 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szndh"]
Feb 20 08:16:26 crc kubenswrapper[5094]: I0220 08:16:26.732485 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-szndh" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server" containerID="cri-o://a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3" gracePeriod=2
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.188779 5094 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.328401 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"49693624-23e2-4579-a736-a6148ac00de5\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.328751 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"49693624-23e2-4579-a736-a6148ac00de5\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.328951 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"49693624-23e2-4579-a736-a6148ac00de5\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.329685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities" (OuterVolumeSpecName: "utilities") pod "49693624-23e2-4579-a736-a6148ac00de5" (UID: "49693624-23e2-4579-a736-a6148ac00de5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.337961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm" (OuterVolumeSpecName: "kube-api-access-r7mcm") pod "49693624-23e2-4579-a736-a6148ac00de5" (UID: "49693624-23e2-4579-a736-a6148ac00de5"). InnerVolumeSpecName "kube-api-access-r7mcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.386010 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49693624-23e2-4579-a736-a6148ac00de5" (UID: "49693624-23e2-4579-a736-a6148ac00de5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.430968 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.431008 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") on node \"crc\" DevicePath \"\"" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.431020 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750042 5094 generic.go:334] "Generic (PLEG): container finished" podID="49693624-23e2-4579-a736-a6148ac00de5" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3" exitCode=0 Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750122 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"} Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750149 5094 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"77800da7e8ad36fe3403f66f675a7ce8b8ae989d7a8a45320926ea3f87f9b807"} Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750211 5094 scope.go:117] "RemoveContainer" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.789315 5094 scope.go:117] "RemoveContainer" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.795216 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szndh"] Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.800528 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-szndh"] Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.813277 5094 scope.go:117] "RemoveContainer" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.833230 5094 scope.go:117] "RemoveContainer" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3" Feb 20 08:16:27 crc kubenswrapper[5094]: E0220 08:16:27.833723 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3\": container with ID starting with a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3 not found: ID does not exist" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.833773 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"} err="failed to get container status \"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3\": rpc error: code = NotFound desc = could not find container \"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3\": container with ID starting with a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3 not found: ID does not exist" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.833821 5094 scope.go:117] "RemoveContainer" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111" Feb 20 08:16:27 crc kubenswrapper[5094]: E0220 08:16:27.834203 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111\": container with ID starting with 9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111 not found: ID does not exist" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.834241 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"} err="failed to get container status \"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111\": rpc error: code = NotFound desc = could not find container \"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111\": container with ID starting with 9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111 not found: ID does not exist" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.834266 5094 scope.go:117] "RemoveContainer" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89" Feb 20 08:16:27 crc kubenswrapper[5094]: E0220 
08:16:27.834590 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89\": container with ID starting with 5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89 not found: ID does not exist" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.834625 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89"} err="failed to get container status \"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89\": rpc error: code = NotFound desc = could not find container \"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89\": container with ID starting with 5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89 not found: ID does not exist" Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.849583 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49693624-23e2-4579-a736-a6148ac00de5" path="/var/lib/kubelet/pods/49693624-23e2-4579-a736-a6148ac00de5/volumes" Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.124937 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.125065 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.166550 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.835474 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:31 crc kubenswrapper[5094]: I0220 08:16:31.043743 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"] Feb 20 08:16:32 crc kubenswrapper[5094]: I0220 08:16:32.792858 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2792j" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="registry-server" containerID="cri-o://09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737" gracePeriod=2 Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.248819 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.426529 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.426613 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.426651 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.427798 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities" (OuterVolumeSpecName: "utilities") pod "6b9a2484-7e04-40b5-aae9-895f1a450ad6" (UID: "6b9a2484-7e04-40b5-aae9-895f1a450ad6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.433318 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb" (OuterVolumeSpecName: "kube-api-access-x2jxb") pod "6b9a2484-7e04-40b5-aae9-895f1a450ad6" (UID: "6b9a2484-7e04-40b5-aae9-895f1a450ad6"). InnerVolumeSpecName "kube-api-access-x2jxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.457831 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b9a2484-7e04-40b5-aae9-895f1a450ad6" (UID: "6b9a2484-7e04-40b5-aae9-895f1a450ad6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.528787 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") on node \"crc\" DevicePath \"\"" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.528830 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.528840 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.804605 5094 generic.go:334] "Generic (PLEG): container finished" podID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737" exitCode=0 Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.804639 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.804655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"} Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.805120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128"} Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.805142 5094 scope.go:117] "RemoveContainer" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.825607 5094 scope.go:117] "RemoveContainer" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.868834 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"] Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.868891 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"] Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.872831 5094 scope.go:117] "RemoveContainer" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.890218 5094 scope.go:117] "RemoveContainer" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737" Feb 20 08:16:33 crc kubenswrapper[5094]: E0220 08:16:33.890862 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737\": container with ID starting with 09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737 not found: ID does not exist" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.890908 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"} err="failed to get container status \"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737\": rpc error: code = NotFound desc = could not find container \"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737\": container with ID starting with 09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737 not found: ID does not exist" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.890936 5094 scope.go:117] "RemoveContainer" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41" Feb 20 08:16:33 crc kubenswrapper[5094]: E0220 08:16:33.891360 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41\": container with ID starting with ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41 not found: ID does not exist" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.891469 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"} err="failed to get container status \"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41\": rpc error: code = NotFound desc = could not find container \"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41\": container with ID 
starting with ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41 not found: ID does not exist" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.891568 5094 scope.go:117] "RemoveContainer" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5" Feb 20 08:16:33 crc kubenswrapper[5094]: E0220 08:16:33.892087 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5\": container with ID starting with 279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5 not found: ID does not exist" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5" Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.892114 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5"} err="failed to get container status \"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5\": rpc error: code = NotFound desc = could not find container \"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5\": container with ID starting with 279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5 not found: ID does not exist" Feb 20 08:16:34 crc kubenswrapper[5094]: I0220 08:16:34.107216 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:16:34 crc kubenswrapper[5094]: I0220 08:16:34.107636 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:16:35 crc kubenswrapper[5094]: I0220 08:16:35.858264 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" path="/var/lib/kubelet/pods/6b9a2484-7e04-40b5-aae9-895f1a450ad6/volumes" Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.106849 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.107518 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.107691 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.108389 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.108457 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
containerName="machine-config-daemon" containerID="cri-o://97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca" gracePeriod=600 Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.063696 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca" exitCode=0 Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.063799 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca"} Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.064274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"} Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.064300 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.838477 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"] Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839427 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839443 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server" Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839466 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" 
containerName="extract-utilities" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839475 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="extract-utilities" Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839486 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="extract-content" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839496 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="extract-content" Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839511 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-utilities" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839520 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-utilities" Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839541 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-content" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839550 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-content" Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839567 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="registry-server" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839575 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="registry-server" Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839776 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" 
containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839798 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.841054 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.879923 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.026746 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.026818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.026891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.127879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.127924 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.127961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.128493 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.128575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.146730 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.192599 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.480212 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:31 crc kubenswrapper[5094]: I0220 08:17:31.319915 5094 generic.go:334] "Generic (PLEG): container finished" podID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771" exitCode=0
Feb 20 08:17:31 crc kubenswrapper[5094]: I0220 08:17:31.319955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771"}
Feb 20 08:17:31 crc kubenswrapper[5094]: I0220 08:17:31.319979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerStarted","Data":"487dc11d31f6ce82e7ae2d892e6eff95fc315f0bc93c3d1894f98a07ce7a3066"}
Feb 20 08:17:32 crc kubenswrapper[5094]: I0220 08:17:32.328675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerStarted","Data":"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"}
Feb 20 08:17:33 crc kubenswrapper[5094]: I0220 08:17:33.342248 5094 generic.go:334] "Generic (PLEG): container finished" podID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076" exitCode=0
Feb 20 08:17:33 crc kubenswrapper[5094]: I0220 08:17:33.342320 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"}
Feb 20 08:17:34 crc kubenswrapper[5094]: I0220 08:17:34.354338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerStarted","Data":"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"}
Feb 20 08:17:34 crc kubenswrapper[5094]: I0220 08:17:34.382744 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jxdj4" podStartSLOduration=2.958839541 podStartE2EDuration="5.38268603s" podCreationTimestamp="2026-02-20 08:17:29 +0000 UTC" firstStartedPulling="2026-02-20 08:17:31.321361204 +0000 UTC m=+5466.193987915" lastFinishedPulling="2026-02-20 08:17:33.745207653 +0000 UTC m=+5468.617834404" observedRunningTime="2026-02-20 08:17:34.37561888 +0000 UTC m=+5469.248245631" watchObservedRunningTime="2026-02-20 08:17:34.38268603 +0000 UTC m=+5469.255312781"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.193531 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.194497 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.272551 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.493658 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.549153 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.421317 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jxdj4" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server" containerID="cri-o://8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb" gracePeriod=2
Feb 20 08:17:42 crc kubenswrapper[5094]: E0220 08:17:42.537352 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f8c64c_9964_45bc_a6f5_b588e04962e1.slice/crio-8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.905803 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.930283 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"64f8c64c-9964-45bc-a6f5-b588e04962e1\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") "
Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.930370 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"64f8c64c-9964-45bc-a6f5-b588e04962e1\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") "
Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.930406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"64f8c64c-9964-45bc-a6f5-b588e04962e1\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") "
Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.932286 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities" (OuterVolumeSpecName: "utilities") pod "64f8c64c-9964-45bc-a6f5-b588e04962e1" (UID: "64f8c64c-9964-45bc-a6f5-b588e04962e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.942408 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb" (OuterVolumeSpecName: "kube-api-access-xfsrb") pod "64f8c64c-9964-45bc-a6f5-b588e04962e1" (UID: "64f8c64c-9964-45bc-a6f5-b588e04962e1"). InnerVolumeSpecName "kube-api-access-xfsrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.031812 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.031864 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") on node \"crc\" DevicePath \"\""
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438787 5094 generic.go:334] "Generic (PLEG): container finished" podID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb" exitCode=0
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438829 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"}
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438811 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438871 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"487dc11d31f6ce82e7ae2d892e6eff95fc315f0bc93c3d1894f98a07ce7a3066"}
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438891 5094 scope.go:117] "RemoveContainer" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.480539 5094 scope.go:117] "RemoveContainer" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.499308 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64f8c64c-9964-45bc-a6f5-b588e04962e1" (UID: "64f8c64c-9964-45bc-a6f5-b588e04962e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.508459 5094 scope.go:117] "RemoveContainer" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.539641 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.553298 5094 scope.go:117] "RemoveContainer" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"
Feb 20 08:17:43 crc kubenswrapper[5094]: E0220 08:17:43.553978 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb\": container with ID starting with 8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb not found: ID does not exist" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554021 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"} err="failed to get container status \"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb\": rpc error: code = NotFound desc = could not find container \"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb\": container with ID starting with 8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb not found: ID does not exist"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554048 5094 scope.go:117] "RemoveContainer" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"
Feb 20 08:17:43 crc kubenswrapper[5094]: E0220 08:17:43.554647 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076\": container with ID starting with d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076 not found: ID does not exist" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554686 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"} err="failed to get container status \"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076\": rpc error: code = NotFound desc = could not find container \"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076\": container with ID starting with d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076 not found: ID does not exist"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554736 5094 scope.go:117] "RemoveContainer" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771"
Feb 20 08:17:43 crc kubenswrapper[5094]: E0220 08:17:43.555084 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771\": container with ID starting with ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771 not found: ID does not exist" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.555126 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771"} err="failed to get container status \"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771\": rpc error: code = NotFound desc = could not find container \"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771\": container with ID starting with ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771 not found: ID does not exist"
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.789067 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.793558 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.855153 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" path="/var/lib/kubelet/pods/64f8c64c-9964-45bc-a6f5-b588e04962e1/volumes"
Feb 20 08:19:04 crc kubenswrapper[5094]: I0220 08:19:04.106760 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:19:04 crc kubenswrapper[5094]: I0220 08:19:04.107378 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.527106 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"]
Feb 20 08:19:11 crc kubenswrapper[5094]: E0220 08:19:11.528256 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528270 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server"
Feb 20 08:19:11 crc kubenswrapper[5094]: E0220 08:19:11.528345 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-utilities"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-utilities"
Feb 20 08:19:11 crc kubenswrapper[5094]: E0220 08:19:11.528367 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-content"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528375 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-content"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528514 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.529650 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.550683 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"]
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.704971 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.705226 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.705360 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807221 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807251 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.834508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.852775 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:12 crc kubenswrapper[5094]: I0220 08:19:12.348555 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"]
Feb 20 08:19:13 crc kubenswrapper[5094]: I0220 08:19:13.253501 5094 generic.go:334] "Generic (PLEG): container finished" podID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad" exitCode=0
Feb 20 08:19:13 crc kubenswrapper[5094]: I0220 08:19:13.253540 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad"}
Feb 20 08:19:13 crc kubenswrapper[5094]: I0220 08:19:13.253564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerStarted","Data":"92e6948396454f04f599a6213d5638ad59001d7ab5a53aa303251ddda9de1c5a"}
Feb 20 08:19:14 crc kubenswrapper[5094]: I0220 08:19:14.267841 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerStarted","Data":"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"}
Feb 20 08:19:15 crc kubenswrapper[5094]: I0220 08:19:15.279029 5094 generic.go:334] "Generic (PLEG): container finished" podID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c" exitCode=0
Feb 20 08:19:15 crc kubenswrapper[5094]: I0220 08:19:15.279118 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"}
Feb 20 08:19:16 crc kubenswrapper[5094]: I0220 08:19:16.292160 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerStarted","Data":"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"}
Feb 20 08:19:16 crc kubenswrapper[5094]: I0220 08:19:16.327903 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b5nqx" podStartSLOduration=2.814928297 podStartE2EDuration="5.32788174s" podCreationTimestamp="2026-02-20 08:19:11 +0000 UTC" firstStartedPulling="2026-02-20 08:19:13.25523572 +0000 UTC m=+5568.127862431" lastFinishedPulling="2026-02-20 08:19:15.768189133 +0000 UTC m=+5570.640815874" observedRunningTime="2026-02-20 08:19:16.32622393 +0000 UTC m=+5571.198850641" watchObservedRunningTime="2026-02-20 08:19:16.32788174 +0000 UTC m=+5571.200508531"
Feb 20 08:19:21 crc kubenswrapper[5094]: I0220 08:19:21.853777 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:21 crc kubenswrapper[5094]: I0220 08:19:21.854302 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:22 crc kubenswrapper[5094]: I0220 08:19:22.914311 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b5nqx" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" probeResult="failure" output=<
Feb 20 08:19:22 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 08:19:22 crc kubenswrapper[5094]: >
Feb 20 08:19:31 crc kubenswrapper[5094]: I0220 08:19:31.905267 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:31 crc kubenswrapper[5094]: I0220 08:19:31.953606 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:32 crc kubenswrapper[5094]: I0220 08:19:32.141561 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"]
Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.446515 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b5nqx" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" containerID="cri-o://abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e" gracePeriod=2
Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.818421 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.938069 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") "
Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.938357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") "
Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.938430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") "
Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.939199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities" (OuterVolumeSpecName: "utilities") pod "a4b6568c-9fe2-4353-835a-d363b4e64f9b" (UID: "a4b6568c-9fe2-4353-835a-d363b4e64f9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.955103 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9" (OuterVolumeSpecName: "kube-api-access-jt7h9") pod "a4b6568c-9fe2-4353-835a-d363b4e64f9b" (UID: "a4b6568c-9fe2-4353-835a-d363b4e64f9b"). InnerVolumeSpecName "kube-api-access-jt7h9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.039415 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.039452 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") on node \"crc\" DevicePath \"\""
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.054933 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4b6568c-9fe2-4353-835a-d363b4e64f9b" (UID: "a4b6568c-9fe2-4353-835a-d363b4e64f9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.106324 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.106592 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.140411 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.454992 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.455004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"}
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.455128 5094 scope.go:117] "RemoveContainer" containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.456996 5094 generic.go:334] "Generic (PLEG): container finished" podID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e" exitCode=0
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.457050 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"92e6948396454f04f599a6213d5638ad59001d7ab5a53aa303251ddda9de1c5a"}
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.483082 5094 scope.go:117] "RemoveContainer" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.497764 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"]
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.497843 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"]
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.504144 5094 scope.go:117] "RemoveContainer" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.523875 5094 scope.go:117] "RemoveContainer" containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"
Feb 20 08:19:34 crc kubenswrapper[5094]: E0220 08:19:34.524260 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e\": container with ID starting with abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e not found: ID does not exist" containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524307 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"} err="failed to get container status \"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e\": rpc error: code = NotFound desc = could not find container \"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e\": container with ID starting with abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e not found: ID does not exist"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524336 5094 scope.go:117] "RemoveContainer" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"
Feb 20 08:19:34 crc kubenswrapper[5094]: E0220 08:19:34.524626 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c\": container with ID starting with 260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c not found: ID does not exist" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524650 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"} err="failed to get container status \"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c\": rpc error: code = NotFound desc = could not find container \"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c\": container with ID starting with 260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c not found: ID does not exist"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524662 5094 scope.go:117] "RemoveContainer" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad"
Feb 20 08:19:34 crc kubenswrapper[5094]: E0220 08:19:34.525120 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad\": container with ID starting with ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad not found: ID does not exist" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad"
Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.525144 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad"} err="failed to get container status \"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad\": rpc error: code = NotFound desc = could not find container \"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad\": container with ID starting with ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad not found: ID does not exist"
Feb 20 08:19:35 crc kubenswrapper[5094]: I0220 08:19:35.858335 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" path="/var/lib/kubelet/pods/a4b6568c-9fe2-4353-835a-d363b4e64f9b/volumes"
Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.106669 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107157 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107192 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107755 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107803 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" gracePeriod=600
Feb 20 08:20:04 crc kubenswrapper[5094]: E0220 08:20:04.245164 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.733458 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" exitCode=0 Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.733502 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"} Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.733533 5094 scope.go:117] "RemoveContainer" containerID="97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.734040 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:04 crc kubenswrapper[5094]: E0220 08:20:04.734360 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:19 crc kubenswrapper[5094]: I0220 08:20:19.840978 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:19 crc kubenswrapper[5094]: E0220 08:20:19.842296 5094 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:31 crc kubenswrapper[5094]: I0220 08:20:31.840876 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:31 crc kubenswrapper[5094]: E0220 08:20:31.841920 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:44 crc kubenswrapper[5094]: I0220 08:20:44.840481 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:44 crc kubenswrapper[5094]: E0220 08:20:44.841539 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:56 crc kubenswrapper[5094]: I0220 08:20:56.840294 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:56 crc kubenswrapper[5094]: E0220 08:20:56.842057 5094 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:08 crc kubenswrapper[5094]: I0220 08:21:08.839796 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:08 crc kubenswrapper[5094]: E0220 08:21:08.840626 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:20 crc kubenswrapper[5094]: I0220 08:21:20.840550 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:20 crc kubenswrapper[5094]: E0220 08:21:20.841738 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:35 crc kubenswrapper[5094]: I0220 08:21:35.845555 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:35 crc kubenswrapper[5094]: E0220 08:21:35.846568 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:50 crc kubenswrapper[5094]: I0220 08:21:50.841226 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:50 crc kubenswrapper[5094]: E0220 08:21:50.842352 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.014820 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.022790 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.174926 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:01 crc kubenswrapper[5094]: E0220 08:22:01.175398 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-utilities" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175424 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-utilities" Feb 20 08:22:01 crc 
kubenswrapper[5094]: E0220 08:22:01.175453 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175463 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" Feb 20 08:22:01 crc kubenswrapper[5094]: E0220 08:22:01.175478 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-content" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175489 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-content" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175666 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.176596 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.178611 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.178930 5094 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-c5nt4" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.179073 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.180160 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.190133 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.269697 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.269761 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.269784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"crc-storage-crc-ksv5l\" (UID: 
\"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371384 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371429 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371453 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371769 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.373258 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.391987 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.527739 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.842402 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:22:01 crc kubenswrapper[5094]: E0220 08:22:01.842806 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.853193 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" path="/var/lib/kubelet/pods/9d8b4842-acdc-4e60-9de5-b7b6dde61b62/volumes" Feb 20 08:22:02 crc kubenswrapper[5094]: I0220 08:22:02.094117 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:02 crc kubenswrapper[5094]: I0220 08:22:02.105173 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:22:02 crc kubenswrapper[5094]: I0220 08:22:02.789345 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ksv5l" 
event={"ID":"7a2bc18e-e8d4-445b-b8aa-34659fab0d46","Type":"ContainerStarted","Data":"25626209a2314dcb52916690377aafb551b2c0e2509d3cbf08a38364344d06cd"} Feb 20 08:22:03 crc kubenswrapper[5094]: I0220 08:22:03.798580 5094 generic.go:334] "Generic (PLEG): container finished" podID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerID="dc0e175dcf3ab875f0111e29b9804a9472d9627cddd8835f9c61529f34f1c8d3" exitCode=0 Feb 20 08:22:03 crc kubenswrapper[5094]: I0220 08:22:03.798648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ksv5l" event={"ID":"7a2bc18e-e8d4-445b-b8aa-34659fab0d46","Type":"ContainerDied","Data":"dc0e175dcf3ab875f0111e29b9804a9472d9627cddd8835f9c61529f34f1c8d3"} Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.239697 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.336042 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.336120 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.336193 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " Feb 20 08:22:05 crc kubenswrapper[5094]: 
I0220 08:22:05.336597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7a2bc18e-e8d4-445b-b8aa-34659fab0d46" (UID: "7a2bc18e-e8d4-445b-b8aa-34659fab0d46"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.342082 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg" (OuterVolumeSpecName: "kube-api-access-k5pzg") pod "7a2bc18e-e8d4-445b-b8aa-34659fab0d46" (UID: "7a2bc18e-e8d4-445b-b8aa-34659fab0d46"). InnerVolumeSpecName "kube-api-access-k5pzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.366617 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7a2bc18e-e8d4-445b-b8aa-34659fab0d46" (UID: "7a2bc18e-e8d4-445b-b8aa-34659fab0d46"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.438507 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.438568 5094 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.438585 5094 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.818579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ksv5l" event={"ID":"7a2bc18e-e8d4-445b-b8aa-34659fab0d46","Type":"ContainerDied","Data":"25626209a2314dcb52916690377aafb551b2c0e2509d3cbf08a38364344d06cd"} Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.818643 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25626209a2314dcb52916690377aafb551b2c0e2509d3cbf08a38364344d06cd" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.818766 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.820314 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.830739 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.852450 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" path="/var/lib/kubelet/pods/7a2bc18e-e8d4-445b-b8aa-34659fab0d46/volumes" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.973773 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-flmdt"] Feb 20 08:22:07 crc kubenswrapper[5094]: E0220 08:22:07.974444 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerName="storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.974486 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerName="storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.974768 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerName="storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.975913 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.979383 5094 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-c5nt4" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.979694 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.980004 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.980960 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.983434 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.983594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.983748 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.986440 5094 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-flmdt"] Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.084378 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.084478 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.084553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.085033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.085948 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 
08:22:08.109618 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt"
Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.300072 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt"
Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.815288 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-flmdt"]
Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.989264 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-flmdt" event={"ID":"aae9458b-5e2d-4930-b32f-ec957b766175","Type":"ContainerStarted","Data":"7303d3d4bf14a883133338e56cfe34c7a95e01e6132c8b6d39989cf5a695dc2e"}
Feb 20 08:22:09 crc kubenswrapper[5094]: I0220 08:22:09.999404 5094 generic.go:334] "Generic (PLEG): container finished" podID="aae9458b-5e2d-4930-b32f-ec957b766175" containerID="d4b83d13fb6f1fea22d75b149f467c7b4670de433e70c72a73a2640f82e7276c" exitCode=0
Feb 20 08:22:09 crc kubenswrapper[5094]: I0220 08:22:09.999513 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-flmdt" event={"ID":"aae9458b-5e2d-4930-b32f-ec957b766175","Type":"ContainerDied","Data":"d4b83d13fb6f1fea22d75b149f467c7b4670de433e70c72a73a2640f82e7276c"}
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.434100 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt"
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.545891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"aae9458b-5e2d-4930-b32f-ec957b766175\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") "
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.546069 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"aae9458b-5e2d-4930-b32f-ec957b766175\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") "
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.546147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"aae9458b-5e2d-4930-b32f-ec957b766175\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") "
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.546463 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "aae9458b-5e2d-4930-b32f-ec957b766175" (UID: "aae9458b-5e2d-4930-b32f-ec957b766175"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.552137 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx" (OuterVolumeSpecName: "kube-api-access-cb7dx") pod "aae9458b-5e2d-4930-b32f-ec957b766175" (UID: "aae9458b-5e2d-4930-b32f-ec957b766175"). InnerVolumeSpecName "kube-api-access-cb7dx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.580879 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "aae9458b-5e2d-4930-b32f-ec957b766175" (UID: "aae9458b-5e2d-4930-b32f-ec957b766175"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.648306 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") on node \"crc\" DevicePath \"\""
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.648355 5094 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.648368 5094 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 20 08:22:12 crc kubenswrapper[5094]: I0220 08:22:12.023985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-flmdt" event={"ID":"aae9458b-5e2d-4930-b32f-ec957b766175","Type":"ContainerDied","Data":"7303d3d4bf14a883133338e56cfe34c7a95e01e6132c8b6d39989cf5a695dc2e"}
Feb 20 08:22:12 crc kubenswrapper[5094]: I0220 08:22:12.024026 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7303d3d4bf14a883133338e56cfe34c7a95e01e6132c8b6d39989cf5a695dc2e"
Feb 20 08:22:12 crc kubenswrapper[5094]: I0220 08:22:12.024088 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt"
Feb 20 08:22:15 crc kubenswrapper[5094]: I0220 08:22:15.847912 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:22:15 crc kubenswrapper[5094]: E0220 08:22:15.848751 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:22:26 crc kubenswrapper[5094]: I0220 08:22:26.839947 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:22:26 crc kubenswrapper[5094]: E0220 08:22:26.840767 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:22:36 crc kubenswrapper[5094]: I0220 08:22:36.257954 5094 scope.go:117] "RemoveContainer" containerID="07fcab491ccca10a02c6e686a0115bd8c0916121144d5fd12b7356bb88847cbf"
Feb 20 08:22:38 crc kubenswrapper[5094]: I0220 08:22:38.840931 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:22:38 crc kubenswrapper[5094]: E0220 08:22:38.841579 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:22:52 crc kubenswrapper[5094]: I0220 08:22:52.841001 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:22:52 crc kubenswrapper[5094]: E0220 08:22:52.843396 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:23:06 crc kubenswrapper[5094]: I0220 08:23:06.840618 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:23:06 crc kubenswrapper[5094]: E0220 08:23:06.841742 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:23:20 crc kubenswrapper[5094]: I0220 08:23:20.840279 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:23:20 crc kubenswrapper[5094]: E0220 08:23:20.841534 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:23:35 crc kubenswrapper[5094]: I0220 08:23:35.848593 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:23:35 crc kubenswrapper[5094]: E0220 08:23:35.849628 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:23:50 crc kubenswrapper[5094]: I0220 08:23:50.840526 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:23:50 crc kubenswrapper[5094]: E0220 08:23:50.841621 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:24:03 crc kubenswrapper[5094]: I0220 08:24:03.839738 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:24:03 crc kubenswrapper[5094]: E0220 08:24:03.840569 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:24:14 crc kubenswrapper[5094]: I0220 08:24:14.840863 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:24:14 crc kubenswrapper[5094]: E0220 08:24:14.841758 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.278239 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"]
Feb 20 08:24:20 crc kubenswrapper[5094]: E0220 08:24:20.279049 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae9458b-5e2d-4930-b32f-ec957b766175" containerName="storage"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.279063 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae9458b-5e2d-4930-b32f-ec957b766175" containerName="storage"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.279191 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae9458b-5e2d-4930-b32f-ec957b766175" containerName="storage"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.279883 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284066 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284599 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284732 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284855 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jb4sz"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284975 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.324050 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"]
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.376721 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.376811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.376865 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.477556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.477622 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.477650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.478919 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.479466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.507210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.595045 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.644914 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"]
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.646309 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.661602 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"]
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.689095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.689212 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.689272 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.790490 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.790562 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.791996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.792099 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.793173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.820025 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.046603 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.165746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"]
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.464442 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.468570 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.471264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.471323 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.471276 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2c4db"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.472757 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.472921 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.477556 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"]
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.504178 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616295 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616345 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616373 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616428 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616475 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616517 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718493 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718559 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718623 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718653 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718726 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.719501 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.720068 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.721736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.721817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.722517 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.722562 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc749f9eaf145df5c444fbae24383d1bdaa4331dffc2f7c6f1445ba7dce2304b/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.730172 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.731581 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.737103 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.737141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.751668 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.803516 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.831740 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.833497 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836096 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836124 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836342 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836450 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rs7rx"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836538 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.861453 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.023997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024079 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb
20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024170 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024231 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125755 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125846 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125874 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125899 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125921 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125949 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc 
kubenswrapper[5094]: I0220 08:24:22.125998 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.126020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.126052 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.127149 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.127391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.128448 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.129692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.130368 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.130399 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c0ed1e37b492902e6baae6f722347d61d8d5759c03a6d0fd94fd84b63621ed84/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.132055 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.132360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.134360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.141632 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.158826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.188034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerStarted","Data":"b8edfc60f02b3a08dfffd5b78149935d5da9920c6ad89968f0a47ed48c3497b0"} Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.189886 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerStarted","Data":"e19bae85da8585d71a99f124e607e047f6c285504074d9b173d9c3014d6e6d83"} 
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.223691 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.288485 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: W0220 08:24:22.296192 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6404f29_e503_4f82_a2ce_e147c18677a7.slice/crio-4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018 WatchSource:0}: Error finding container 4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018: Status 404 returned error can't find the container with id 4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018 Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.314795 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.316283 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.323769 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cddr5" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.324047 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.325368 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.325668 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.327010 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.351086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431343 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431445 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-kolla-config\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431583 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ldvqr\" (UniqueName: \"kubernetes.io/projected/542d99bc-6049-42dc-9036-8a795552e896-kube-api-access-ldvqr\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431671 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-config-data-default\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431776 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/542d99bc-6049-42dc-9036-8a795552e896-config-data-generated\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431834 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431967 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-operator-scripts\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.432084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533864 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533955 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-kolla-config\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533985 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvqr\" (UniqueName: \"kubernetes.io/projected/542d99bc-6049-42dc-9036-8a795552e896-kube-api-access-ldvqr\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534008 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-config-data-default\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534043 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/542d99bc-6049-42dc-9036-8a795552e896-config-data-generated\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534108 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-operator-scripts\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.535596 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-operator-scripts\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.536119 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.536144 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/542d99bc-6049-42dc-9036-8a795552e896-config-data-generated\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.536587 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-config-data-default\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.538178 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.538207 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0ba4264431e89fd45e3b88a4a6661a636cbc0affb8e1a965d3661fe6696cb9c8/globalmount\"" pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.539940 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.540418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.566757 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvqr\" (UniqueName: \"kubernetes.io/projected/542d99bc-6049-42dc-9036-8a795552e896-kube-api-access-ldvqr\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.579126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod 
\"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.653227 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.668854 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.939820 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.945944 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.950145 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mgp6g" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.950925 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.958046 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.042150 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-config-data\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.043151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqkm\" (UniqueName: \"kubernetes.io/projected/5074d037-240e-4685-8c3b-3dd7b963beb0-kube-api-access-djqkm\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " 
pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.043270 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-kolla-config\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.069715 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.144523 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-config-data\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.144608 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqkm\" (UniqueName: \"kubernetes.io/projected/5074d037-240e-4685-8c3b-3dd7b963beb0-kube-api-access-djqkm\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.144649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-kolla-config\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.145843 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-config-data\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc 
kubenswrapper[5094]: I0220 08:24:23.146613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-kolla-config\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.163590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqkm\" (UniqueName: \"kubernetes.io/projected/5074d037-240e-4685-8c3b-3dd7b963beb0-kube-api-access-djqkm\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.215748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerStarted","Data":"4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018"} Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.221361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerStarted","Data":"dcdb3b2cee1e238efa34b6ce00f898e563246cfe9a36783212511b2b820cb19f"} Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.223352 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerStarted","Data":"9e766a894fc47307f318a9f38ad47bbac110c5ff58c56284d0fae3825eea954c"} Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.279450 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0"
Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.713020 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.106199 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.109641 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.112479 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.112721 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c2zzx"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.113808 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.113858 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.114142 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.234033 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5074d037-240e-4685-8c3b-3dd7b963beb0","Type":"ContainerStarted","Data":"c1f694fa5035758e7ea5e4a26c7fadd55dd76c0890987ed3fac18d3998a072f1"}
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271010 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271109 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271167 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271234 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271441 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh64k\" (UniqueName: \"kubernetes.io/projected/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kube-api-access-rh64k\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373462 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373527 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373611 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh64k\" (UniqueName: \"kubernetes.io/projected/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kube-api-access-rh64k\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373746 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373772 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373798 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373905 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.374610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.375191 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.375229 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.377470 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.377506 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e3a1c21aa2f31023ffd9d8ce062194bd602191be618555955b2d9607f3d4eb2e/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.380018 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.393528 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.393993 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh64k\" (UniqueName: \"kubernetes.io/projected/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kube-api-access-rh64k\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.404206 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.444156 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.856295 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 20 08:24:24 crc kubenswrapper[5094]: W0220 08:24:24.874881 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98dd23d5_7a26_4a06_a35a_e818b8feba3c.slice/crio-7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e WatchSource:0}: Error finding container 7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e: Status 404 returned error can't find the container with id 7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e
Feb 20 08:24:25 crc kubenswrapper[5094]: I0220 08:24:25.244010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerStarted","Data":"7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e"}
Feb 20 08:24:28 crc kubenswrapper[5094]: I0220 08:24:28.841933 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:24:28 crc kubenswrapper[5094]: E0220 08:24:28.842115 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:24:43 crc kubenswrapper[5094]: I0220 08:24:43.840523 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:24:43 crc kubenswrapper[5094]: E0220 08:24:43.841343 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.537318 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerStarted","Data":"a6d1279d9d85f16e430a938ac1dc6735a73ff059dd1b7fd319df3fe9ec5c1713"}
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.539440 5094 generic.go:334] "Generic (PLEG): container finished" podID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerID="24e6eb261ea32e536bfab420ed66babedb6b616c3c0c6563b11146f4eaf4e89e" exitCode=0
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.539528 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerDied","Data":"24e6eb261ea32e536bfab420ed66babedb6b616c3c0c6563b11146f4eaf4e89e"}
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.541198 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5074d037-240e-4685-8c3b-3dd7b963beb0","Type":"ContainerStarted","Data":"0f1c0f233201ef672176a0ccaab58c929e5fb1e955a41efd3b7b5b3a438bb757"}
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.541411 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.543586 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c" exitCode=0
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.543639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerDied","Data":"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c"}
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.546269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerStarted","Data":"33e61e60c5f24519ad97451c04f3fc6b600a1332b53418633fa0a10d49fe57bb"}
Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.639923 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.616130379 podStartE2EDuration="22.639897089s" podCreationTimestamp="2026-02-20 08:24:22 +0000 UTC" firstStartedPulling="2026-02-20 08:24:23.729249946 +0000 UTC m=+5878.601876657" lastFinishedPulling="2026-02-20 08:24:43.753016666 +0000 UTC m=+5898.625643367" observedRunningTime="2026-02-20 08:24:44.618189794 +0000 UTC m=+5899.490816525" watchObservedRunningTime="2026-02-20 08:24:44.639897089 +0000 UTC m=+5899.512523800"
Feb 20 08:24:44 crc kubenswrapper[5094]: E0220 08:24:44.767441 5094 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Feb 20 08:24:44 crc kubenswrapper[5094]: 	rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 20 08:24:44 crc kubenswrapper[5094]: > podSandboxID="b8edfc60f02b3a08dfffd5b78149935d5da9920c6ad89968f0a47ed48c3497b0"
Feb 20 08:24:44 crc kubenswrapper[5094]: E0220 08:24:44.767622 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 20 08:24:44 crc kubenswrapper[5094]: 	container &Container{Name:dnsmasq-dns,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:f0473f3e18dd17d7021c02e991298923,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c7bkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bbf6488d7-47ktg_openstack(3b479d95-6e62-47d9-9a4f-ae0db08b69f0): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 20 08:24:44 crc kubenswrapper[5094]: > logger="UnhandledError"
Feb 20 08:24:44 crc kubenswrapper[5094]: E0220 08:24:44.768817 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0"
Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.553658 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerStarted","Data":"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"}
Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.556559 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerStarted","Data":"8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023"}
Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.557104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.559898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerStarted","Data":"6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b"}
Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.660478 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" podStartSLOduration=3.477640103 podStartE2EDuration="25.660455596s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:21.525840672 +0000 UTC m=+5876.398467383" lastFinishedPulling="2026-02-20 08:24:43.708656165 +0000 UTC m=+5898.581282876" observedRunningTime="2026-02-20 08:24:45.630597926 +0000 UTC m=+5900.503224627" watchObservedRunningTime="2026-02-20 08:24:45.660455596 +0000 UTC m=+5900.533082317"
Feb 20 08:24:46 crc kubenswrapper[5094]: I0220 08:24:46.567538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerStarted","Data":"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"}
Feb 20 08:24:46 crc kubenswrapper[5094]: I0220 08:24:46.568832 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:46 crc kubenswrapper[5094]: I0220 08:24:46.595012 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" podStartSLOduration=4.061823541 podStartE2EDuration="26.594989529s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:21.175521138 +0000 UTC m=+5876.048147849" lastFinishedPulling="2026-02-20 08:24:43.708687126 +0000 UTC m=+5898.581313837" observedRunningTime="2026-02-20 08:24:46.591344861 +0000 UTC m=+5901.463971602" watchObservedRunningTime="2026-02-20 08:24:46.594989529 +0000 UTC m=+5901.467616250"
Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.578081 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerDied","Data":"a6d1279d9d85f16e430a938ac1dc6735a73ff059dd1b7fd319df3fe9ec5c1713"}
Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.577963 5094 generic.go:334] "Generic (PLEG): container finished" podID="98dd23d5-7a26-4a06-a35a-e818b8feba3c" containerID="a6d1279d9d85f16e430a938ac1dc6735a73ff059dd1b7fd319df3fe9ec5c1713" exitCode=0
Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.580507 5094 generic.go:334] "Generic (PLEG): container finished" podID="542d99bc-6049-42dc-9036-8a795552e896" containerID="33e61e60c5f24519ad97451c04f3fc6b600a1332b53418633fa0a10d49fe57bb" exitCode=0
Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.580597 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerDied","Data":"33e61e60c5f24519ad97451c04f3fc6b600a1332b53418633fa0a10d49fe57bb"}
Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.591044 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerStarted","Data":"a05b6ba9743e2aa1a2a2831e18ff866d108d1cd723ee36b2b5263f8cab5bda9b"}
Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.593029 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerStarted","Data":"04cae0d528ed5bf01cb8000560f61cd697745bcdb665cbb5524d94ef584ed691"}
Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.616953 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=6.993498583 podStartE2EDuration="27.616934844s" podCreationTimestamp="2026-02-20 08:24:21 +0000 UTC" firstStartedPulling="2026-02-20 08:24:23.090528402 +0000 UTC m=+5877.963155113" lastFinishedPulling="2026-02-20 08:24:43.713964663 +0000 UTC m=+5898.586591374" observedRunningTime="2026-02-20 08:24:48.610996381 +0000 UTC m=+5903.483623172" watchObservedRunningTime="2026-02-20 08:24:48.616934844 +0000 UTC m=+5903.489561555"
Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.641096 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=6.719832842 podStartE2EDuration="25.641073346s" podCreationTimestamp="2026-02-20 08:24:23 +0000 UTC" firstStartedPulling="2026-02-20 08:24:24.876408319 +0000 UTC m=+5879.749035030" lastFinishedPulling="2026-02-20 08:24:43.797648823 +0000 UTC m=+5898.670275534" observedRunningTime="2026-02-20 08:24:48.634067097 +0000 UTC m=+5903.506693868" watchObservedRunningTime="2026-02-20 08:24:48.641073346 +0000 UTC m=+5903.513700057"
Feb 20 08:24:50 crc kubenswrapper[5094]: I0220 08:24:50.602177 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:51 crc kubenswrapper[5094]: I0220 08:24:51.047878 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:24:51 crc kubenswrapper[5094]: I0220 08:24:51.102445 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"]
Feb 20 08:24:51 crc kubenswrapper[5094]: I0220 08:24:51.613591 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" containerID="cri-o://ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e" gracePeriod=10
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.053018 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.203263 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") "
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.203316 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") "
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.203361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") pod \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") "
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.208256 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt" (OuterVolumeSpecName: "kube-api-access-c7bkt") pod "3b479d95-6e62-47d9-9a4f-ae0db08b69f0" (UID: "3b479d95-6e62-47d9-9a4f-ae0db08b69f0"). InnerVolumeSpecName "kube-api-access-c7bkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.238512 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config" (OuterVolumeSpecName: "config") pod "3b479d95-6e62-47d9-9a4f-ae0db08b69f0" (UID: "3b479d95-6e62-47d9-9a4f-ae0db08b69f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.244536 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b479d95-6e62-47d9-9a4f-ae0db08b69f0" (UID: "3b479d95-6e62-47d9-9a4f-ae0db08b69f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.304558 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") on node \"crc\" DevicePath \"\""
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.304595 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.304604 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632549 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e" exitCode=0
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerDied","Data":"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"}
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632622 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerDied","Data":"b8edfc60f02b3a08dfffd5b78149935d5da9920c6ad89968f0a47ed48c3497b0"}
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632618 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632637 5094 scope.go:117] "RemoveContainer" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.654464 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.654500 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.668017 5094 scope.go:117] "RemoveContainer" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.668532 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"]
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.676402 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"]
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.694096 5094 scope.go:117] "RemoveContainer" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"
Feb 20 08:24:52 crc kubenswrapper[5094]: E0220 08:24:52.694530 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e\": container with ID starting with ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e not found: ID does not exist" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.694593 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"} err="failed to get container status \"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e\": rpc error: code = NotFound desc = could not find container \"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e\": container with ID starting with ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e not found: ID does not exist"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.694626 5094 scope.go:117] "RemoveContainer" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c"
Feb 20 08:24:52 crc kubenswrapper[5094]: E0220 08:24:52.695262 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c\": container with ID starting with e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c not found: ID does not exist" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c"
Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.695304 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c"} err="failed to get container status \"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c\": rpc error: code = NotFound desc = could not find container \"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c\": container with ID starting with e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c not found: ID does not exist"
Feb 20 08:24:53 crc kubenswrapper[5094]: I0220 08:24:53.280478 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 20 08:24:53 crc kubenswrapper[5094]: I0220 08:24:53.857288 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" path="/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volumes"
Feb 20 08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.444457 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.444987 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.526672 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.721348 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 20 08:24:55 crc kubenswrapper[5094]: I0220 08:24:55.197427 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 20 08:24:55 crc kubenswrapper[5094]: I0220 08:24:55.320292 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 20 08:24:56 crc kubenswrapper[5094]: I0220 08:24:56.840141 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:24:56 crc kubenswrapper[5094]: E0220 08:24:56.840698 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.329388 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-m2rs9"]
Feb 20 08:25:01 crc kubenswrapper[5094]: E0220 08:25:01.332506 5094 cpu_manager.go:410] "RemoveStaleState: removing
container" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="init" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.332607 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="init" Feb 20 08:25:01 crc kubenswrapper[5094]: E0220 08:25:01.332668 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.332785 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.332983 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.333521 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.339166 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.343620 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.449256 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.449388 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.551369 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.551526 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.552755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.572343 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.654887 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.899540 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:01 crc kubenswrapper[5094]: W0220 08:25:01.901201 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bfb5bd8_1bf4_4080_908d_71dac3301aca.slice/crio-bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266 WatchSource:0}: Error finding container bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266: Status 404 returned error can't find the container with id bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266 Feb 20 08:25:02 crc kubenswrapper[5094]: I0220 08:25:02.719325 5094 generic.go:334] "Generic (PLEG): container finished" podID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerID="5fe6cd402f52794a3175518b1d65628a3975facf339971987772005f254a31df" exitCode=0 Feb 20 08:25:02 crc kubenswrapper[5094]: I0220 08:25:02.719385 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m2rs9" event={"ID":"6bfb5bd8-1bf4-4080-908d-71dac3301aca","Type":"ContainerDied","Data":"5fe6cd402f52794a3175518b1d65628a3975facf339971987772005f254a31df"} Feb 20 08:25:02 crc kubenswrapper[5094]: I0220 08:25:02.719447 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m2rs9" event={"ID":"6bfb5bd8-1bf4-4080-908d-71dac3301aca","Type":"ContainerStarted","Data":"bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266"} Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.124628 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.208635 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.208744 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.209612 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bfb5bd8-1bf4-4080-908d-71dac3301aca" (UID: "6bfb5bd8-1bf4-4080-908d-71dac3301aca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.217359 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj" (OuterVolumeSpecName: "kube-api-access-rcnbj") pod "6bfb5bd8-1bf4-4080-908d-71dac3301aca" (UID: "6bfb5bd8-1bf4-4080-908d-71dac3301aca"). InnerVolumeSpecName "kube-api-access-rcnbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.310891 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.311281 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.740379 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m2rs9" event={"ID":"6bfb5bd8-1bf4-4080-908d-71dac3301aca","Type":"ContainerDied","Data":"bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266"} Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.740437 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.740935 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:07 crc kubenswrapper[5094]: I0220 08:25:07.841074 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:25:08 crc kubenswrapper[5094]: I0220 08:25:08.087794 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:08 crc kubenswrapper[5094]: I0220 08:25:08.093368 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:08 crc kubenswrapper[5094]: I0220 08:25:08.774784 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea"} Feb 20 08:25:09 crc kubenswrapper[5094]: I0220 08:25:09.851204 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" path="/var/lib/kubelet/pods/6bfb5bd8-1bf4-4080-908d-71dac3301aca/volumes" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.113330 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:25:13 crc kubenswrapper[5094]: E0220 08:25:13.114725 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerName="mariadb-account-create-update" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.114749 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerName="mariadb-account-create-update" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.115082 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerName="mariadb-account-create-update" Feb 20 08:25:13 
crc kubenswrapper[5094]: I0220 08:25:13.116213 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.118940 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.121616 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.181054 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.181125 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.282745 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.282865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.284223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.312369 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.454329 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.892111 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:25:13 crc kubenswrapper[5094]: W0220 08:25:13.899066 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2665ce_2c09_43f9_8245_ed36e682e1e0.slice/crio-9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24 WatchSource:0}: Error finding container 9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24: Status 404 returned error can't find the container with id 9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24 Feb 20 08:25:14 crc kubenswrapper[5094]: I0220 08:25:14.827468 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerID="3a10c0a7a48b4e7a28b7f39fc5231d6ac90168dce54dc2c58472a7fe7bfce49e" exitCode=0 Feb 20 08:25:14 crc kubenswrapper[5094]: I0220 08:25:14.827836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bwdbk" event={"ID":"5e2665ce-2c09-43f9-8245-ed36e682e1e0","Type":"ContainerDied","Data":"3a10c0a7a48b4e7a28b7f39fc5231d6ac90168dce54dc2c58472a7fe7bfce49e"} Feb 20 08:25:14 crc kubenswrapper[5094]: I0220 08:25:14.827869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bwdbk" event={"ID":"5e2665ce-2c09-43f9-8245-ed36e682e1e0","Type":"ContainerStarted","Data":"9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24"} Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.222948 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.329249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.329478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.331335 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e2665ce-2c09-43f9-8245-ed36e682e1e0" (UID: "5e2665ce-2c09-43f9-8245-ed36e682e1e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.340964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n" (OuterVolumeSpecName: "kube-api-access-cll4n") pod "5e2665ce-2c09-43f9-8245-ed36e682e1e0" (UID: "5e2665ce-2c09-43f9-8245-ed36e682e1e0"). InnerVolumeSpecName "kube-api-access-cll4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.432260 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.432312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.847287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bwdbk" event={"ID":"5e2665ce-2c09-43f9-8245-ed36e682e1e0","Type":"ContainerDied","Data":"9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24"} Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.847635 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.847354 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.854873 5094 generic.go:334] "Generic (PLEG): container finished" podID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerID="6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b" exitCode=0 Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.855825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerDied","Data":"6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b"} Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.857234 5094 generic.go:334] "Generic (PLEG): container finished" podID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47" exitCode=0 Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.857263 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerDied","Data":"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"} Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.865416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerStarted","Data":"6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62"} Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.865904 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.867196 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerStarted","Data":"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"} Feb 20 08:25:18 crc 
kubenswrapper[5094]: I0220 08:25:18.867394 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.893389 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.457040871 podStartE2EDuration="58.89336666s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:22.320765245 +0000 UTC m=+5877.193391946" lastFinishedPulling="2026-02-20 08:24:43.757091024 +0000 UTC m=+5898.629717735" observedRunningTime="2026-02-20 08:25:18.886659878 +0000 UTC m=+5933.759286589" watchObservedRunningTime="2026-02-20 08:25:18.89336666 +0000 UTC m=+5933.765993371" Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.912532 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.906894197 podStartE2EDuration="58.912515592s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:22.703060211 +0000 UTC m=+5877.575686922" lastFinishedPulling="2026-02-20 08:24:43.708681606 +0000 UTC m=+5898.581308317" observedRunningTime="2026-02-20 08:25:18.908540586 +0000 UTC m=+5933.781167297" watchObservedRunningTime="2026-02-20 08:25:18.912515592 +0000 UTC m=+5933.785142303" Feb 20 08:25:31 crc kubenswrapper[5094]: I0220 08:25:31.806017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 08:25:32 crc kubenswrapper[5094]: I0220 08:25:32.228158 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.683409 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:25:39 crc kubenswrapper[5094]: E0220 08:25:39.684399 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerName="mariadb-account-create-update" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.684418 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerName="mariadb-account-create-update" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.684614 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerName="mariadb-account-create-update" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.685680 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.695892 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.841330 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.841641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.841854 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod 
\"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.943995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.944142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.944300 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.945082 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.945736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" 
Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.973698 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:40 crc kubenswrapper[5094]: I0220 08:25:40.006131 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:40 crc kubenswrapper[5094]: W0220 08:25:40.582788 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4c545b_01fc_4e08_994c_7d24a10a963e.slice/crio-980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca WatchSource:0}: Error finding container 980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca: Status 404 returned error can't find the container with id 980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca Feb 20 08:25:40 crc kubenswrapper[5094]: I0220 08:25:40.589379 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:25:40 crc kubenswrapper[5094]: I0220 08:25:40.714160 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.075265 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" exitCode=0 Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.075313 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerDied","Data":"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84"} 
Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.075357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerStarted","Data":"980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca"} Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.620293 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.099068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerStarted","Data":"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3"} Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.099789 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.119845 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" podStartSLOduration=3.119827301 podStartE2EDuration="3.119827301s" podCreationTimestamp="2026-02-20 08:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:25:42.118672273 +0000 UTC m=+5956.991298984" watchObservedRunningTime="2026-02-20 08:25:42.119827301 +0000 UTC m=+5956.992454012" Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.781224 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq" containerID="cri-o://6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62" gracePeriod=604798 Feb 20 08:25:43 crc kubenswrapper[5094]: I0220 08:25:43.580978 5094 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq" containerID="cri-o://b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74" gracePeriod=604799 Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.193955 5094 generic.go:334] "Generic (PLEG): container finished" podID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerID="6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62" exitCode=0 Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.194252 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerDied","Data":"6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62"} Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.360747 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540394 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540466 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: 
\"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540588 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540785 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540809 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540850 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540873 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.541391 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.541594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.542155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.546605 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp" (OuterVolumeSpecName: "kube-api-access-n8tnp") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "kube-api-access-n8tnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.548009 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info" (OuterVolumeSpecName: "pod-info") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.553520 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.554107 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1" (OuterVolumeSpecName: "persistence") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.560262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf" (OuterVolumeSpecName: "server-conf") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.623341 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.643442 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") on node \"crc\" " Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644016 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644115 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644138 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644188 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644207 5094 reconciler_common.go:293] 
"Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644222 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644276 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644296 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.662776 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.663053 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1") on node "crc" Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.745311 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.008010 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.064016 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.064276 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="dnsmasq-dns" containerID="cri-o://8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023" gracePeriod=10 Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.209053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerDied","Data":"4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018"} Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.209105 5094 scope.go:117] "RemoveContainer" containerID="6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.209147 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.212787 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.215137 5094 generic.go:334] "Generic (PLEG): container finished" podID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74" exitCode=0 Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.215226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerDied","Data":"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"} Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.215258 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerDied","Data":"9e766a894fc47307f318a9f38ad47bbac110c5ff58c56284d0fae3825eea954c"} Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.219240 5094 generic.go:334] "Generic (PLEG): container finished" podID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerID="8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023" exitCode=0 Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.219279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerDied","Data":"8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023"} Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.236282 5094 scope.go:117] "RemoveContainer" containerID="6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.263716 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.270281 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.291521 5094 scope.go:117] "RemoveContainer" containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303351 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303637 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303649 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq" Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303666 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="setup-container" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303672 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="setup-container" Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303689 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303697 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq" Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303738 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="setup-container" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303747 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="setup-container" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303928 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303941 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.304833 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.307059 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313038 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313247 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313418 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2c4db" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313467 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.333641 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.334858 5094 scope.go:117] "RemoveContainer" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.367178 5094 scope.go:117] "RemoveContainer" 
containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74" Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.368429 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74\": container with ID starting with b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74 not found: ID does not exist" containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.368461 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"} err="failed to get container status \"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74\": rpc error: code = NotFound desc = could not find container \"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74\": container with ID starting with b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74 not found: ID does not exist" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.368484 5094 scope.go:117] "RemoveContainer" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47" Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.368733 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47\": container with ID starting with 969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47 not found: ID does not exist" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.368757 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"} err="failed to get container status \"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47\": rpc error: code = NotFound desc = could not find container \"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47\": container with ID starting with 969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47 not found: ID does not exist" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.373869 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.373932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.373989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374026 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374063 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374264 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374294 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374314 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374434 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374463 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfn2p\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-kube-api-access-nfn2p\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374570 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374647 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374678 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374713 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.378169 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.378181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.378802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.379524 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info" (OuterVolumeSpecName: "pod-info") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.383310 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.390976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd" (OuterVolumeSpecName: "persistence") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "pvc-219358f9-7520-4512-8729-274ec1ad54bd". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.395083 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s" (OuterVolumeSpecName: "kube-api-access-4qh8s") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "kube-api-access-4qh8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.399928 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf" (OuterVolumeSpecName: "server-conf") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.452966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.462902 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475527 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475668 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475990 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476060 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476090 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfn2p\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-kube-api-access-nfn2p\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476123 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476185 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-pod-info\") pod \"rabbitmq-server-0\" 
(UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476238 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476281 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476291 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476313 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476322 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476331 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476340 5094 reconciler_common.go:293] "Volume detached for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476348 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476357 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476377 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") on node \"crc\" " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.479159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.483690 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5" (OuterVolumeSpecName: "kube-api-access-9pqx5") pod "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" (UID: "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9"). InnerVolumeSpecName "kube-api-access-9pqx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.483965 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.485459 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.486729 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.487669 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.487694 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc749f9eaf145df5c444fbae24383d1bdaa4331dffc2f7c6f1445ba7dce2304b/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.488238 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.488764 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.491471 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.505294 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.505466 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-219358f9-7520-4512-8729-274ec1ad54bd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd") on node "crc" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.512771 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfn2p\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-kube-api-access-nfn2p\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.529675 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" (UID: "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.533463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.541966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config" (OuterVolumeSpecName: "config") pod "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" (UID: "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577199 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577233 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577247 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577257 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.637601 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.910681 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.229755 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerDied","Data":"e19bae85da8585d71a99f124e607e047f6c285504074d9b173d9c3014d6e6d83"} Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.230151 5094 scope.go:117] "RemoveContainer" containerID="8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.230026 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.238391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerStarted","Data":"7b80f9c0dd2c624b48bdf59524151b4fa2295a49aa7e051802bad2d412291ce3"} Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.239444 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.260266 5094 scope.go:117] "RemoveContainer" containerID="24e6eb261ea32e536bfab420ed66babedb6b616c3c0c6563b11146f4eaf4e89e" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.265212 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.284655 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.304379 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.304447 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.322845 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: E0220 08:25:51.323377 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="dnsmasq-dns" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.323462 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="dnsmasq-dns" Feb 20 08:25:51 crc kubenswrapper[5094]: E0220 08:25:51.323538 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="init" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.323601 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="init" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.323811 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" 
containerName="dnsmasq-dns" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.324594 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.333147 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.336754 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.337054 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.337215 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.337328 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rs7rx" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.340473 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411467 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411561 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411617 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411643 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/392a6bbf-c80d-4142-adb2-b4828517b1c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411672 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpp2v\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-kube-api-access-cpp2v\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.412084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/392a6bbf-c80d-4142-adb2-b4828517b1c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512574 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpp2v\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-kube-api-access-cpp2v\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/392a6bbf-c80d-4142-adb2-b4828517b1c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512675 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512732 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/392a6bbf-c80d-4142-adb2-b4828517b1c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.514863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.515037 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.515141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.516042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.517173 5094 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.517272 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c0ed1e37b492902e6baae6f722347d61d8d5759c03a6d0fd94fd84b63621ed84/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.517871 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/392a6bbf-c80d-4142-adb2-b4828517b1c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.519155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/392a6bbf-c80d-4142-adb2-b4828517b1c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.521791 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.532643 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpp2v\" 
(UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-kube-api-access-cpp2v\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.548332 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.684020 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.865534 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" path="/var/lib/kubelet/pods/09194bc6-429f-46a1-8dac-8b84385c9e10/volumes" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.866846 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" path="/var/lib/kubelet/pods/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9/volumes" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.876532 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" path="/var/lib/kubelet/pods/d6404f29-e503-4f82-a2ce-e147c18677a7/volumes" Feb 20 08:25:52 crc kubenswrapper[5094]: I0220 08:25:52.107095 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:52 crc kubenswrapper[5094]: W0220 08:25:52.146101 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392a6bbf_c80d_4142_adb2_b4828517b1c6.slice/crio-f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9 WatchSource:0}: Error finding container f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9: Status 404 returned error can't find the container with id f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9 Feb 20 08:25:52 crc kubenswrapper[5094]: I0220 08:25:52.256635 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerStarted","Data":"f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9"} Feb 20 08:25:53 crc kubenswrapper[5094]: I0220 08:25:53.268069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerStarted","Data":"ab3945abed45e569ed16f126fb6c69dd16f3c7414312304803aee733c8faae22"} Feb 20 08:25:54 crc kubenswrapper[5094]: I0220 08:25:54.277642 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerStarted","Data":"cb2ce7eb067ea29894907db6542d2260151c1caaba71914d8c805a9a260f43dc"} Feb 20 08:26:24 crc kubenswrapper[5094]: I0220 08:26:24.523204 5094 generic.go:334] "Generic (PLEG): container finished" podID="fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b" containerID="ab3945abed45e569ed16f126fb6c69dd16f3c7414312304803aee733c8faae22" exitCode=0 Feb 20 08:26:24 crc kubenswrapper[5094]: I0220 08:26:24.523289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerDied","Data":"ab3945abed45e569ed16f126fb6c69dd16f3c7414312304803aee733c8faae22"} Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.531872 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerStarted","Data":"f31fffa9aaf7df7267a9d1b277cc7729ff296a53878bb5b76a0999edb7ddc055"} Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.532334 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.538007 5094 generic.go:334] "Generic (PLEG): container finished" podID="392a6bbf-c80d-4142-adb2-b4828517b1c6" containerID="cb2ce7eb067ea29894907db6542d2260151c1caaba71914d8c805a9a260f43dc" exitCode=0 Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.538049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerDied","Data":"cb2ce7eb067ea29894907db6542d2260151c1caaba71914d8c805a9a260f43dc"} Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.560401 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.560382695 podStartE2EDuration="35.560382695s" podCreationTimestamp="2026-02-20 08:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:26:25.557537806 +0000 UTC m=+6000.430164537" watchObservedRunningTime="2026-02-20 08:26:25.560382695 +0000 UTC m=+6000.433009406" Feb 20 08:26:26 crc kubenswrapper[5094]: I0220 08:26:26.545941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerStarted","Data":"a02e39cab42ca86cca65460f27340ab2f34d0de07eb22c8b5a3463d05dba2a09"} Feb 20 08:26:26 crc kubenswrapper[5094]: I0220 08:26:26.546648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:26:26 crc 
kubenswrapper[5094]: I0220 08:26:26.567162 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.56714297 podStartE2EDuration="35.56714297s" podCreationTimestamp="2026-02-20 08:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:26:26.564553529 +0000 UTC m=+6001.437180250" watchObservedRunningTime="2026-02-20 08:26:26.56714297 +0000 UTC m=+6001.439769691" Feb 20 08:26:40 crc kubenswrapper[5094]: I0220 08:26:40.641945 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 08:26:41 crc kubenswrapper[5094]: I0220 08:26:41.688009 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.678357 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.680191 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.685072 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.690690 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.870045 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"mariadb-client\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " pod="openstack/mariadb-client" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.972082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"mariadb-client\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " pod="openstack/mariadb-client" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.994349 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"mariadb-client\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " pod="openstack/mariadb-client" Feb 20 08:26:49 crc kubenswrapper[5094]: I0220 08:26:49.005238 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:26:49 crc kubenswrapper[5094]: I0220 08:26:49.605749 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:26:49 crc kubenswrapper[5094]: W0220 08:26:49.616319 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a72a034_da88_4003_930b_d4a4843366fa.slice/crio-153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054 WatchSource:0}: Error finding container 153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054: Status 404 returned error can't find the container with id 153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054 Feb 20 08:26:49 crc kubenswrapper[5094]: I0220 08:26:49.719872 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerStarted","Data":"153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054"} Feb 20 08:26:50 crc kubenswrapper[5094]: I0220 08:26:50.729883 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerStarted","Data":"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b"} Feb 20 08:26:50 crc kubenswrapper[5094]: I0220 08:26:50.749198 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.155472409 podStartE2EDuration="2.749177597s" podCreationTimestamp="2026-02-20 08:26:48 +0000 UTC" firstStartedPulling="2026-02-20 08:26:49.618261456 +0000 UTC m=+6024.490888167" lastFinishedPulling="2026-02-20 08:26:50.211966634 +0000 UTC m=+6025.084593355" observedRunningTime="2026-02-20 08:26:50.74188907 +0000 UTC m=+6025.614515791" watchObservedRunningTime="2026-02-20 08:26:50.749177597 +0000 UTC m=+6025.621804308" Feb 20 08:27:03 crc 
kubenswrapper[5094]: I0220 08:27:03.750971 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:27:03 crc kubenswrapper[5094]: I0220 08:27:03.751875 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" containerID="cri-o://ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" gracePeriod=30 Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.295339 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.361138 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"0a72a034-da88-4003-930b-d4a4843366fa\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.366404 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227" (OuterVolumeSpecName: "kube-api-access-jk227") pod "0a72a034-da88-4003-930b-d4a4843366fa" (UID: "0a72a034-da88-4003-930b-d4a4843366fa"). InnerVolumeSpecName "kube-api-access-jk227". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.463142 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878391 5094 generic.go:334] "Generic (PLEG): container finished" podID="0a72a034-da88-4003-930b-d4a4843366fa" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" exitCode=143 Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerDied","Data":"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b"} Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878485 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerDied","Data":"153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054"} Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878494 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878506 5094 scope.go:117] "RemoveContainer" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.896172 5094 scope.go:117] "RemoveContainer" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" Feb 20 08:27:04 crc kubenswrapper[5094]: E0220 08:27:04.896809 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b\": container with ID starting with ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b not found: ID does not exist" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.896916 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b"} err="failed to get container status \"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b\": rpc error: code = NotFound desc = could not find container \"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b\": container with ID starting with ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b not found: ID does not exist" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.952451 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.958907 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:27:05 crc kubenswrapper[5094]: I0220 08:27:05.848856 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a72a034-da88-4003-930b-d4a4843366fa" 
path="/var/lib/kubelet/pods/0a72a034-da88-4003-930b-d4a4843366fa/volumes" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.890492 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:09 crc kubenswrapper[5094]: E0220 08:27:09.891478 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.891495 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.891721 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.893443 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.899267 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.052167 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.052236 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " 
pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.052319 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.153943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154005 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154037 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154505 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " 
pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.174173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.224452 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.648384 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.931473 5094 generic.go:334] "Generic (PLEG): container finished" podID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" exitCode=0 Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.931568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625"} Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.931850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" 
event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerStarted","Data":"b576f773f6ac06b5034d0d8b7def4a70037f3dd2770948ebe163885be0ac669d"} Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.935766 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:27:11 crc kubenswrapper[5094]: I0220 08:27:11.939434 5094 generic.go:334] "Generic (PLEG): container finished" podID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" exitCode=0 Feb 20 08:27:11 crc kubenswrapper[5094]: I0220 08:27:11.939537 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc"} Feb 20 08:27:12 crc kubenswrapper[5094]: I0220 08:27:12.951459 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerStarted","Data":"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb"} Feb 20 08:27:12 crc kubenswrapper[5094]: I0220 08:27:12.974243 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pn85q" podStartSLOduration=2.5842042259999998 podStartE2EDuration="3.974218s" podCreationTimestamp="2026-02-20 08:27:09 +0000 UTC" firstStartedPulling="2026-02-20 08:27:10.93553939 +0000 UTC m=+6045.808166101" lastFinishedPulling="2026-02-20 08:27:12.325553154 +0000 UTC m=+6047.198179875" observedRunningTime="2026-02-20 08:27:12.969018024 +0000 UTC m=+6047.841644735" watchObservedRunningTime="2026-02-20 08:27:12.974218 +0000 UTC m=+6047.846844711" Feb 20 08:27:20 crc kubenswrapper[5094]: I0220 08:27:20.225758 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:20 crc kubenswrapper[5094]: I0220 08:27:20.226813 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:20 crc kubenswrapper[5094]: I0220 08:27:20.277339 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:21 crc kubenswrapper[5094]: I0220 08:27:21.046232 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:21 crc kubenswrapper[5094]: I0220 08:27:21.090972 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.021308 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pn85q" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server" containerID="cri-o://e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" gracePeriod=2 Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.534056 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.573459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"5727179b-3c74-4339-8714-80c77b9ac4c1\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.573566 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"5727179b-3c74-4339-8714-80c77b9ac4c1\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.573641 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"5727179b-3c74-4339-8714-80c77b9ac4c1\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.574607 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities" (OuterVolumeSpecName: "utilities") pod "5727179b-3c74-4339-8714-80c77b9ac4c1" (UID: "5727179b-3c74-4339-8714-80c77b9ac4c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.579342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7" (OuterVolumeSpecName: "kube-api-access-q75c7") pod "5727179b-3c74-4339-8714-80c77b9ac4c1" (UID: "5727179b-3c74-4339-8714-80c77b9ac4c1"). InnerVolumeSpecName "kube-api-access-q75c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.598168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5727179b-3c74-4339-8714-80c77b9ac4c1" (UID: "5727179b-3c74-4339-8714-80c77b9ac4c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.674390 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.674440 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.674452 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:23 crc kubenswrapper[5094]: E0220 08:27:23.941257 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5727179b_3c74_4339_8714_80c77b9ac4c1.slice\": RecentStats: unable to find data in memory cache]" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030178 5094 generic.go:334] "Generic (PLEG): container finished" podID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" exitCode=0 Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030226 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb"} Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"b576f773f6ac06b5034d0d8b7def4a70037f3dd2770948ebe163885be0ac669d"} Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030271 5094 scope.go:117] "RemoveContainer" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030300 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.064057 5094 scope.go:117] "RemoveContainer" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.065636 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.074125 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.097942 5094 scope.go:117] "RemoveContainer" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.122479 5094 scope.go:117] "RemoveContainer" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" Feb 20 08:27:24 crc kubenswrapper[5094]: E0220 08:27:24.123640 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb\": container with ID starting with e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb not found: ID does not exist" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.123802 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb"} err="failed to get container status \"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb\": rpc error: code = NotFound desc = could not find container \"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb\": container with ID starting with e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb not found: ID does not exist" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.123892 5094 scope.go:117] "RemoveContainer" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" Feb 20 08:27:24 crc kubenswrapper[5094]: E0220 08:27:24.124595 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc\": container with ID starting with bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc not found: ID does not exist" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.124662 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc"} err="failed to get container status \"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc\": rpc error: code = NotFound desc = could not find container \"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc\": container with ID 
starting with bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc not found: ID does not exist" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.125039 5094 scope.go:117] "RemoveContainer" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" Feb 20 08:27:24 crc kubenswrapper[5094]: E0220 08:27:24.127074 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625\": container with ID starting with f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625 not found: ID does not exist" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.127132 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625"} err="failed to get container status \"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625\": rpc error: code = NotFound desc = could not find container \"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625\": container with ID starting with f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625 not found: ID does not exist" Feb 20 08:27:25 crc kubenswrapper[5094]: I0220 08:27:25.852496 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" path="/var/lib/kubelet/pods/5727179b-3c74-4339-8714-80c77b9ac4c1/volumes" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.469747 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6g8l"] Feb 20 08:27:26 crc kubenswrapper[5094]: E0220 08:27:26.470034 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-utilities" Feb 20 08:27:26 crc 
kubenswrapper[5094]: I0220 08:27:26.470046 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-utilities"
Feb 20 08:27:26 crc kubenswrapper[5094]: E0220 08:27:26.470062 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.470068 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server"
Feb 20 08:27:26 crc kubenswrapper[5094]: E0220 08:27:26.470090 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-content"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.470096 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-content"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.470242 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.471211 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.504978 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"]
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.523130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.523202 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.523330 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625191 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625271 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625868 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.642749 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.806310 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:27 crc kubenswrapper[5094]: I0220 08:27:27.087677 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"]
Feb 20 08:27:28 crc kubenswrapper[5094]: I0220 08:27:28.065667 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354" exitCode=0
Feb 20 08:27:28 crc kubenswrapper[5094]: I0220 08:27:28.065766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354"}
Feb 20 08:27:28 crc kubenswrapper[5094]: I0220 08:27:28.066278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerStarted","Data":"332979b6e60754867c0fa91afe6bdeb22d4fbcd58e4acb21f601fa64b4983b95"}
Feb 20 08:27:29 crc kubenswrapper[5094]: I0220 08:27:29.078641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerStarted","Data":"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"}
Feb 20 08:27:30 crc kubenswrapper[5094]: I0220 08:27:30.087599 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91" exitCode=0
Feb 20 08:27:30 crc kubenswrapper[5094]: I0220 08:27:30.087644 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"}
Feb 20 08:27:31 crc kubenswrapper[5094]: I0220 08:27:31.097255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerStarted","Data":"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"}
Feb 20 08:27:31 crc kubenswrapper[5094]: I0220 08:27:31.128518 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6g8l" podStartSLOduration=2.495333651 podStartE2EDuration="5.128498178s" podCreationTimestamp="2026-02-20 08:27:26 +0000 UTC" firstStartedPulling="2026-02-20 08:27:28.068744001 +0000 UTC m=+6062.941370712" lastFinishedPulling="2026-02-20 08:27:30.701908508 +0000 UTC m=+6065.574535239" observedRunningTime="2026-02-20 08:27:31.1198877 +0000 UTC m=+6065.992514421" watchObservedRunningTime="2026-02-20 08:27:31.128498178 +0000 UTC m=+6066.001124889"
Feb 20 08:27:34 crc kubenswrapper[5094]: I0220 08:27:34.106414 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:27:34 crc kubenswrapper[5094]: I0220 08:27:34.106984 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:27:36 crc kubenswrapper[5094]: I0220 08:27:36.806686 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:36 crc kubenswrapper[5094]: I0220 08:27:36.807195 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:36 crc kubenswrapper[5094]: I0220 08:27:36.853210 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:37 crc kubenswrapper[5094]: I0220 08:27:37.182450 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:37 crc kubenswrapper[5094]: I0220 08:27:37.228942 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"]
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.156180 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h6g8l" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server" containerID="cri-o://2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff" gracePeriod=2
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.606973 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.633150 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") "
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.633276 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") "
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.633305 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") "
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.634467 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities" (OuterVolumeSpecName: "utilities") pod "40cdda9b-f8ca-4731-b2ab-e982c3ec6893" (UID: "40cdda9b-f8ca-4731-b2ab-e982c3ec6893"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.639927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97" (OuterVolumeSpecName: "kube-api-access-rmt97") pod "40cdda9b-f8ca-4731-b2ab-e982c3ec6893" (UID: "40cdda9b-f8ca-4731-b2ab-e982c3ec6893"). InnerVolumeSpecName "kube-api-access-rmt97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.689309 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40cdda9b-f8ca-4731-b2ab-e982c3ec6893" (UID: "40cdda9b-f8ca-4731-b2ab-e982c3ec6893"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.734751 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") on node \"crc\" DevicePath \"\""
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.734792 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.734804 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165180 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff" exitCode=0
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"}
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165265 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"332979b6e60754867c0fa91afe6bdeb22d4fbcd58e4acb21f601fa64b4983b95"}
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165290 5094 scope.go:117] "RemoveContainer" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165230 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.208767 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"]
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.208926 5094 scope.go:117] "RemoveContainer" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.215941 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"]
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.265871 5094 scope.go:117] "RemoveContainer" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.293040 5094 scope.go:117] "RemoveContainer" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"
Feb 20 08:27:40 crc kubenswrapper[5094]: E0220 08:27:40.293506 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff\": container with ID starting with 2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff not found: ID does not exist" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.293545 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"} err="failed to get container status \"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff\": rpc error: code = NotFound desc = could not find container \"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff\": container with ID starting with 2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff not found: ID does not exist"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.293590 5094 scope.go:117] "RemoveContainer" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"
Feb 20 08:27:40 crc kubenswrapper[5094]: E0220 08:27:40.294011 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91\": container with ID starting with e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91 not found: ID does not exist" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.294039 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"} err="failed to get container status \"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91\": rpc error: code = NotFound desc = could not find container \"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91\": container with ID starting with e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91 not found: ID does not exist"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.294059 5094 scope.go:117] "RemoveContainer" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354"
Feb 20 08:27:40 crc kubenswrapper[5094]: E0220 08:27:40.294334 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354\": container with ID starting with 2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354 not found: ID does not exist" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354"
Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.294358 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354"} err="failed to get container status \"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354\": rpc error: code = NotFound desc = could not find container \"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354\": container with ID starting with 2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354 not found: ID does not exist"
Feb 20 08:27:41 crc kubenswrapper[5094]: I0220 08:27:41.852728 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" path="/var/lib/kubelet/pods/40cdda9b-f8ca-4731-b2ab-e982c3ec6893/volumes"
Feb 20 08:28:04 crc kubenswrapper[5094]: I0220 08:28:04.107337 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:28:04 crc kubenswrapper[5094]: I0220 08:28:04.108067 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.107586 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.108536 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.108625 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.109856 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.109971 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea" gracePeriod=600
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.655253 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea" exitCode=0
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.655337 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea"}
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.655994 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050"}
Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.656017 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"
Feb 20 08:28:36 crc kubenswrapper[5094]: I0220 08:28:36.459116 5094 scope.go:117] "RemoveContainer" containerID="dc0e175dcf3ab875f0111e29b9804a9472d9627cddd8835f9c61529f34f1c8d3"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.088557 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"]
Feb 20 08:29:43 crc kubenswrapper[5094]: E0220 08:29:43.089978 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-utilities"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090002 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-utilities"
Feb 20 08:29:43 crc kubenswrapper[5094]: E0220 08:29:43.090021 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090032 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server"
Feb 20 08:29:43 crc kubenswrapper[5094]: E0220 08:29:43.090045 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-content"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090057 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-content"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090298 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.092006 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.105396 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"]
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.178311 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.178403 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.178446 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.279422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.279515 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.279557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.280418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.280512 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.303607 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.420861 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.925798 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"]
Feb 20 08:29:44 crc kubenswrapper[5094]: I0220 08:29:44.286217 5094 generic.go:334] "Generic (PLEG): container finished" podID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" exitCode=0
Feb 20 08:29:44 crc kubenswrapper[5094]: I0220 08:29:44.286279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6"}
Feb 20 08:29:44 crc kubenswrapper[5094]: I0220 08:29:44.286317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerStarted","Data":"02149a89d5d5d34e333a8506120c9c4e6feb6b6b4482074736c18109ec9c31ae"}
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.279813 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"]
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.281556 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.295481 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerStarted","Data":"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d"}
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.296085 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"]
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.415753 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.415856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.416678 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.518463 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.518565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.518611 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.519208 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.519222 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.542228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.658851 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54"
Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.140112 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"]
Feb 20 08:29:46 crc kubenswrapper[5094]: W0220 08:29:46.144123 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386a80fc_f69f_4bc8_bc43_8c3eba784c4e.slice/crio-9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b WatchSource:0}: Error finding container 9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b: Status 404 returned error can't find the container with id 9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b
Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.308773 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerStarted","Data":"9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b"}
Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.313661 5094 generic.go:334] "Generic (PLEG): container finished" podID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" exitCode=0
Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.313736 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d"}
Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.324865 5094 generic.go:334] "Generic (PLEG): container finished" podID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" exitCode=0
Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.324969 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54"}
Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.328849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerStarted","Data":"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5"}
Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.368895 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cmrpp" podStartSLOduration=1.936447024 podStartE2EDuration="4.368876419s" podCreationTimestamp="2026-02-20 08:29:43 +0000 UTC" firstStartedPulling="2026-02-20 08:29:44.288372831 +0000 UTC m=+6199.160999572" lastFinishedPulling="2026-02-20 08:29:46.720802266 +0000 UTC m=+6201.593428967" observedRunningTime="2026-02-20 08:29:47.366154833 +0000 UTC m=+6202.238781544" watchObservedRunningTime="2026-02-20 08:29:47.368876419 +0000 UTC m=+6202.241503130"
Feb 20 08:29:48 crc kubenswrapper[5094]: I0220 08:29:48.336351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerStarted","Data":"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733"}
Feb 20 08:29:49 crc kubenswrapper[5094]: I0220 08:29:49.346169 5094 generic.go:334] "Generic (PLEG): container finished" podID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" exitCode=0
Feb 20 08:29:49 crc kubenswrapper[5094]: I0220 08:29:49.346303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733"}
Feb 20 08:29:50 crc kubenswrapper[5094]: I0220 08:29:50.354731 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerStarted","Data":"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125"}
Feb 20 08:29:50 crc kubenswrapper[5094]: I0220 08:29:50.370791 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jjk54" podStartSLOduration=2.99574619 podStartE2EDuration="5.370772458s" podCreationTimestamp="2026-02-20 08:29:45 +0000 UTC" firstStartedPulling="2026-02-20 08:29:47.327000227 +0000 UTC m=+6202.199626938" lastFinishedPulling="2026-02-20 08:29:49.702026495 +0000 UTC m=+6204.574653206" observedRunningTime="2026-02-20 08:29:50.369244631 +0000 UTC m=+6205.241871342" watchObservedRunningTime="2026-02-20 08:29:50.370772458 +0000 UTC m=+6205.243399179"
Feb 20 08:29:53 crc kubenswrapper[5094]: I0220 08:29:53.421881 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cmrpp"
Feb 20 08:29:53 crc kubenswrapper[5094]: I0220 08:29:53.422198 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:53 crc kubenswrapper[5094]: I0220 08:29:53.479128 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:54 crc kubenswrapper[5094]: I0220 08:29:54.439266 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:54 crc kubenswrapper[5094]: I0220 08:29:54.871347 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:55 crc kubenswrapper[5094]: I0220 08:29:55.659883 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:55 crc kubenswrapper[5094]: I0220 08:29:55.659936 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.409408 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cmrpp" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" containerID="cri-o://eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" gracePeriod=2 Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.704903 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jjk54" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" probeResult="failure" output=< Feb 20 08:29:56 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:29:56 crc kubenswrapper[5094]: > Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.812079 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.988462 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.988601 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.988633 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.990532 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities" (OuterVolumeSpecName: "utilities") pod "a9e3f247-149c-4eb7-9eff-7c13eb87a975" (UID: "a9e3f247-149c-4eb7-9eff-7c13eb87a975"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.994979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t" (OuterVolumeSpecName: "kube-api-access-fd74t") pod "a9e3f247-149c-4eb7-9eff-7c13eb87a975" (UID: "a9e3f247-149c-4eb7-9eff-7c13eb87a975"). InnerVolumeSpecName "kube-api-access-fd74t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.040764 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9e3f247-149c-4eb7-9eff-7c13eb87a975" (UID: "a9e3f247-149c-4eb7-9eff-7c13eb87a975"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.090680 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.090734 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.090745 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") on node \"crc\" DevicePath \"\"" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417805 5094 generic.go:334] "Generic (PLEG): container finished" podID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" exitCode=0 Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5"} Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417875 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"02149a89d5d5d34e333a8506120c9c4e6feb6b6b4482074736c18109ec9c31ae"} Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417892 5094 scope.go:117] "RemoveContainer" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417906 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.445837 5094 scope.go:117] "RemoveContainer" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.458446 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.467408 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.472487 5094 scope.go:117] "RemoveContainer" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.509415 5094 scope.go:117] "RemoveContainer" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" Feb 20 08:29:57 crc kubenswrapper[5094]: E0220 08:29:57.509842 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5\": container with ID starting with eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5 not found: ID does not exist" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 
08:29:57.509897 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5"} err="failed to get container status \"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5\": rpc error: code = NotFound desc = could not find container \"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5\": container with ID starting with eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5 not found: ID does not exist" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.509925 5094 scope.go:117] "RemoveContainer" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" Feb 20 08:29:57 crc kubenswrapper[5094]: E0220 08:29:57.510215 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d\": container with ID starting with 431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d not found: ID does not exist" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.510235 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d"} err="failed to get container status \"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d\": rpc error: code = NotFound desc = could not find container \"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d\": container with ID starting with 431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d not found: ID does not exist" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.510247 5094 scope.go:117] "RemoveContainer" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" Feb 20 08:29:57 crc 
kubenswrapper[5094]: E0220 08:29:57.510807 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6\": container with ID starting with 939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6 not found: ID does not exist" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.510847 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6"} err="failed to get container status \"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6\": rpc error: code = NotFound desc = could not find container \"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6\": container with ID starting with 939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6 not found: ID does not exist" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.855877 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" path="/var/lib/kubelet/pods/a9e3f247-149c-4eb7-9eff-7c13eb87a975/volumes" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.162219 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 08:30:00 crc kubenswrapper[5094]: E0220 08:30:00.163088 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-utilities" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163128 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-utilities" Feb 20 08:30:00 crc kubenswrapper[5094]: E0220 08:30:00.163187 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163198 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" Feb 20 08:30:00 crc kubenswrapper[5094]: E0220 08:30:00.163230 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-content" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163239 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-content" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163449 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.164227 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.167612 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.167895 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.177184 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.350462 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod 
\"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.350557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.350641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.453197 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.453601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.453621 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.455233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.464320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.470283 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.496640 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.718327 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 08:30:01 crc kubenswrapper[5094]: I0220 08:30:01.458663 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerID="89a68b3798c7e61a71c5a1f766e1642edc8983858caba5c4db74959c3a8cdcec" exitCode=0 Feb 20 08:30:01 crc kubenswrapper[5094]: I0220 08:30:01.458760 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" event={"ID":"5c7c75bd-9812-4d90-80ea-08eda0f926fc","Type":"ContainerDied","Data":"89a68b3798c7e61a71c5a1f766e1642edc8983858caba5c4db74959c3a8cdcec"} Feb 20 08:30:01 crc kubenswrapper[5094]: I0220 08:30:01.458793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" event={"ID":"5c7c75bd-9812-4d90-80ea-08eda0f926fc","Type":"ContainerStarted","Data":"fd34be79f2c5a696df0c96f7a0bfdd55ee04248c01f08074b0c8cbd32ebf1e54"} Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.771585 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.792422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.792468 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.792536 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.793577 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c7c75bd-9812-4d90-80ea-08eda0f926fc" (UID: "5c7c75bd-9812-4d90-80ea-08eda0f926fc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.799061 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c7c75bd-9812-4d90-80ea-08eda0f926fc" (UID: "5c7c75bd-9812-4d90-80ea-08eda0f926fc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.799107 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf" (OuterVolumeSpecName: "kube-api-access-46mmf") pod "5c7c75bd-9812-4d90-80ea-08eda0f926fc" (UID: "5c7c75bd-9812-4d90-80ea-08eda0f926fc"). InnerVolumeSpecName "kube-api-access-46mmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.894132 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.894184 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.894196 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.476119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" event={"ID":"5c7c75bd-9812-4d90-80ea-08eda0f926fc","Type":"ContainerDied","Data":"fd34be79f2c5a696df0c96f7a0bfdd55ee04248c01f08074b0c8cbd32ebf1e54"} Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.476158 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd34be79f2c5a696df0c96f7a0bfdd55ee04248c01f08074b0c8cbd32ebf1e54" Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.476220 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.857889 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.868048 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.716929 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.785234 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.849361 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" path="/var/lib/kubelet/pods/1c1d2dad-446d-40c2-aceb-de13411f5c93/volumes" Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.961341 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:30:07 crc kubenswrapper[5094]: I0220 08:30:07.512947 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jjk54" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" containerID="cri-o://abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" gracePeriod=2 Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.169615 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.181989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.182131 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.182156 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.183175 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities" (OuterVolumeSpecName: "utilities") pod "386a80fc-f69f-4bc8-bc43-8c3eba784c4e" (UID: "386a80fc-f69f-4bc8-bc43-8c3eba784c4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.190207 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6" (OuterVolumeSpecName: "kube-api-access-f9vr6") pod "386a80fc-f69f-4bc8-bc43-8c3eba784c4e" (UID: "386a80fc-f69f-4bc8-bc43-8c3eba784c4e"). InnerVolumeSpecName "kube-api-access-f9vr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.284262 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.284631 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.311491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "386a80fc-f69f-4bc8-bc43-8c3eba784c4e" (UID: "386a80fc-f69f-4bc8-bc43-8c3eba784c4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.385495 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.524896 5094 generic.go:334] "Generic (PLEG): container finished" podID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" exitCode=0 Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.524948 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.524970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125"} Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.526186 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b"} Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.526243 5094 scope.go:117] "RemoveContainer" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.558259 5094 scope.go:117] "RemoveContainer" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.560456 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.577567 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.597044 5094 scope.go:117] "RemoveContainer" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.631388 5094 scope.go:117] "RemoveContainer" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" Feb 20 08:30:08 crc kubenswrapper[5094]: E0220 08:30:08.635514 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125\": container with ID starting with abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125 not found: ID does not exist" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.635579 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125"} err="failed to get container status \"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125\": rpc error: code = NotFound desc = could not find container \"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125\": container with ID starting with abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125 not found: ID does not exist" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.635617 5094 scope.go:117] "RemoveContainer" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" Feb 20 08:30:08 crc kubenswrapper[5094]: E0220 08:30:08.636271 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733\": container with ID starting with 333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733 not found: ID does not exist" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.636447 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733"} err="failed to get container status \"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733\": rpc error: code = NotFound desc = could not find container \"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733\": container with ID 
starting with 333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733 not found: ID does not exist" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.636626 5094 scope.go:117] "RemoveContainer" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" Feb 20 08:30:08 crc kubenswrapper[5094]: E0220 08:30:08.637241 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54\": container with ID starting with 3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54 not found: ID does not exist" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.637281 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54"} err="failed to get container status \"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54\": rpc error: code = NotFound desc = could not find container \"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54\": container with ID starting with 3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54 not found: ID does not exist" Feb 20 08:30:09 crc kubenswrapper[5094]: I0220 08:30:09.850506 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" path="/var/lib/kubelet/pods/386a80fc-f69f-4bc8-bc43-8c3eba784c4e/volumes" Feb 20 08:30:34 crc kubenswrapper[5094]: I0220 08:30:34.107524 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:30:34 crc kubenswrapper[5094]: I0220 
08:30:34.108106 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:30:36 crc kubenswrapper[5094]: I0220 08:30:36.558408 5094 scope.go:117] "RemoveContainer" containerID="362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb" Feb 20 08:31:04 crc kubenswrapper[5094]: I0220 08:31:04.107341 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:31:04 crc kubenswrapper[5094]: I0220 08:31:04.108000 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.106825 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.107528 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.107596 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.108411 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.108494 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" gracePeriod=600 Feb 20 08:31:34 crc kubenswrapper[5094]: E0220 08:31:34.235043 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.246877 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" exitCode=0 Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.246936 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050"} Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.247045 5094 scope.go:117] "RemoveContainer" containerID="3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.248371 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:31:34 crc kubenswrapper[5094]: E0220 08:31:34.248965 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:31:36 crc kubenswrapper[5094]: I0220 08:31:36.641936 5094 scope.go:117] "RemoveContainer" containerID="5fe6cd402f52794a3175518b1d65628a3975facf339971987772005f254a31df" Feb 20 08:31:45 crc kubenswrapper[5094]: I0220 08:31:45.840838 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:31:45 crc kubenswrapper[5094]: E0220 08:31:45.841663 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:31:58 crc kubenswrapper[5094]: I0220 08:31:58.840953 5094 scope.go:117] "RemoveContainer" 
containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:31:58 crc kubenswrapper[5094]: E0220 08:31:58.842335 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:11 crc kubenswrapper[5094]: I0220 08:32:11.840836 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:11 crc kubenswrapper[5094]: E0220 08:32:11.841886 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:23 crc kubenswrapper[5094]: I0220 08:32:23.841232 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:23 crc kubenswrapper[5094]: E0220 08:32:23.842259 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:38 crc kubenswrapper[5094]: I0220 08:32:38.839977 5094 scope.go:117] 
"RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:38 crc kubenswrapper[5094]: E0220 08:32:38.840691 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:50 crc kubenswrapper[5094]: I0220 08:32:50.840258 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:50 crc kubenswrapper[5094]: E0220 08:32:50.841054 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:02 crc kubenswrapper[5094]: I0220 08:33:02.840425 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:02 crc kubenswrapper[5094]: E0220 08:33:02.841288 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:13 crc kubenswrapper[5094]: I0220 08:33:13.840574 
5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:13 crc kubenswrapper[5094]: E0220 08:33:13.841862 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:24 crc kubenswrapper[5094]: I0220 08:33:24.840296 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:24 crc kubenswrapper[5094]: E0220 08:33:24.841029 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:35 crc kubenswrapper[5094]: I0220 08:33:35.849047 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:35 crc kubenswrapper[5094]: E0220 08:33:35.850201 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:50 crc kubenswrapper[5094]: I0220 
08:33:50.839946 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:50 crc kubenswrapper[5094]: E0220 08:33:50.840620 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:04 crc kubenswrapper[5094]: I0220 08:34:04.840298 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:04 crc kubenswrapper[5094]: E0220 08:34:04.841084 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:08 crc kubenswrapper[5094]: I0220 08:34:08.068895 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" podUID="059e3724-d657-4f2e-beec-f4f55e09e498" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:34:19 crc kubenswrapper[5094]: I0220 08:34:19.840829 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:19 crc kubenswrapper[5094]: E0220 08:34:19.841780 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:31 crc kubenswrapper[5094]: I0220 08:34:31.841151 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:31 crc kubenswrapper[5094]: E0220 08:34:31.842874 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:45 crc kubenswrapper[5094]: I0220 08:34:45.848363 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:45 crc kubenswrapper[5094]: E0220 08:34:45.849449 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:57 crc kubenswrapper[5094]: I0220 08:34:57.840586 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:57 crc kubenswrapper[5094]: E0220 08:34:57.842388 5094 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:09 crc kubenswrapper[5094]: I0220 08:35:09.840759 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:09 crc kubenswrapper[5094]: E0220 08:35:09.841675 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:17 crc kubenswrapper[5094]: I0220 08:35:17.086258 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:35:17 crc kubenswrapper[5094]: I0220 08:35:17.095846 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:35:17 crc kubenswrapper[5094]: I0220 08:35:17.850558 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" path="/var/lib/kubelet/pods/5e2665ce-2c09-43f9-8245-ed36e682e1e0/volumes" Feb 20 08:35:23 crc kubenswrapper[5094]: I0220 08:35:23.839943 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:23 crc kubenswrapper[5094]: E0220 08:35:23.840916 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:36 crc kubenswrapper[5094]: I0220 08:35:36.746830 5094 scope.go:117] "RemoveContainer" containerID="3a10c0a7a48b4e7a28b7f39fc5231d6ac90168dce54dc2c58472a7fe7bfce49e" Feb 20 08:35:37 crc kubenswrapper[5094]: I0220 08:35:37.841064 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:37 crc kubenswrapper[5094]: E0220 08:35:37.841475 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:52 crc kubenswrapper[5094]: I0220 08:35:52.840825 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:52 crc kubenswrapper[5094]: E0220 08:35:52.841674 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:04 crc kubenswrapper[5094]: I0220 08:36:04.840587 5094 scope.go:117] "RemoveContainer" 
containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:04 crc kubenswrapper[5094]: E0220 08:36:04.841431 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:16 crc kubenswrapper[5094]: I0220 08:36:16.842915 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:16 crc kubenswrapper[5094]: E0220 08:36:16.844592 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:30 crc kubenswrapper[5094]: I0220 08:36:30.840794 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:30 crc kubenswrapper[5094]: E0220 08:36:30.841920 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:43 crc kubenswrapper[5094]: I0220 08:36:43.840387 5094 scope.go:117] 
"RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:44 crc kubenswrapper[5094]: I0220 08:36:44.858298 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb"} Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.984881 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985682 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-content" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985693 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-content" Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985742 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-utilities" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985752 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-utilities" Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985766 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985773 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985784 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerName="collect-profiles" Feb 20 
08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985789 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerName="collect-profiles" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985928 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985939 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerName="collect-profiles" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.986435 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.996585 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.997916 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.097759 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.097821 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.199057 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.199210 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.202901 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.202931 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a4207eaa54a39bb320d6003c0fa348ef31a94bbf0305c6e782ec2791dde8b18/globalmount\"" pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.221818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.232027 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.307008 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.809917 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 08:37:19 crc kubenswrapper[5094]: I0220 08:37:19.134639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerStarted","Data":"0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0"} Feb 20 08:37:19 crc kubenswrapper[5094]: I0220 08:37:19.134685 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerStarted","Data":"9eef613ade19e2ed2f7272eef05d7fc30f774f3cc73f115f6f763788afc1cc96"} Feb 20 08:37:19 crc kubenswrapper[5094]: I0220 08:37:19.148811 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.148794034 podStartE2EDuration="3.148794034s" podCreationTimestamp="2026-02-20 08:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:37:19.147037672 +0000 UTC m=+6654.019664383" watchObservedRunningTime="2026-02-20 08:37:19.148794034 +0000 UTC m=+6654.021420745" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.716140 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.717727 
5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.733416 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.856689 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.856766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.856823 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.958004 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 
08:37:20.958070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.958494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.959590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.960015 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.978036 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:21 crc kubenswrapper[5094]: I0220 08:37:21.039743 5094 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:21 crc kubenswrapper[5094]: I0220 08:37:21.499075 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.185443 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f9a7688-af6f-4953-b440-409492c949c9" containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" exitCode=0 Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.185569 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513"} Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.185740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerStarted","Data":"8c642e0fc4a13dfb17f0c0c42eaaf1f018f91fd34f83c2194b14e4acf8f25a6f"} Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.187821 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.772164 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.773272 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.780510 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.892065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"mariadb-client\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " pod="openstack/mariadb-client" Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.993623 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"mariadb-client\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " pod="openstack/mariadb-client" Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.017575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"mariadb-client\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " pod="openstack/mariadb-client" Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.165856 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.204001 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f9a7688-af6f-4953-b440-409492c949c9" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" exitCode=0 Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.204032 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06"} Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.592771 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.211759 5094 generic.go:334] "Generic (PLEG): container finished" podID="d771cf42-84a5-4783-8680-2ffad753c57e" containerID="cfdeb3a947002392415700b896a4c45e60680753af31e8fcda15dd6191a2f1dc" exitCode=0 Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.211796 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d771cf42-84a5-4783-8680-2ffad753c57e","Type":"ContainerDied","Data":"cfdeb3a947002392415700b896a4c45e60680753af31e8fcda15dd6191a2f1dc"} Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.212054 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d771cf42-84a5-4783-8680-2ffad753c57e","Type":"ContainerStarted","Data":"470051e33cbb8af1d6882246b88c3dbe7592d48c358cb5b30ed369af587b5673"} Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.213849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerStarted","Data":"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80"} Feb 20 08:37:24 crc 
kubenswrapper[5094]: I0220 08:37:24.241957 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rdh9h" podStartSLOduration=2.842336342 podStartE2EDuration="4.241939307s" podCreationTimestamp="2026-02-20 08:37:20 +0000 UTC" firstStartedPulling="2026-02-20 08:37:22.187600954 +0000 UTC m=+6657.060227665" lastFinishedPulling="2026-02-20 08:37:23.587203919 +0000 UTC m=+6658.459830630" observedRunningTime="2026-02-20 08:37:24.238005562 +0000 UTC m=+6659.110632283" watchObservedRunningTime="2026-02-20 08:37:24.241939307 +0000 UTC m=+6659.114566018" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.564032 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.589226 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_d771cf42-84a5-4783-8680-2ffad753c57e/mariadb-client/0.log" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.616292 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.621843 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.642496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"d771cf42-84a5-4783-8680-2ffad753c57e\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.648077 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv" (OuterVolumeSpecName: "kube-api-access-r72nv") pod "d771cf42-84a5-4783-8680-2ffad753c57e" (UID: 
"d771cf42-84a5-4783-8680-2ffad753c57e"). InnerVolumeSpecName "kube-api-access-r72nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.744672 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.794596 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: E0220 08:37:25.795251 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" containerName="mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.795270 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" containerName="mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.795579 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" containerName="mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.796551 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.803031 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.846980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"mariadb-client\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.855346 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" path="/var/lib/kubelet/pods/d771cf42-84a5-4783-8680-2ffad753c57e/volumes" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.948408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"mariadb-client\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.969792 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"mariadb-client\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " pod="openstack/mariadb-client" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.119124 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.258670 5094 scope.go:117] "RemoveContainer" containerID="cfdeb3a947002392415700b896a4c45e60680753af31e8fcda15dd6191a2f1dc" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.258742 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.606652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:26 crc kubenswrapper[5094]: W0220 08:37:26.612981 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a2ead87_5f06_45fa_aa5d_347ff16f517e.slice/crio-1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4 WatchSource:0}: Error finding container 1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4: Status 404 returned error can't find the container with id 1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4 Feb 20 08:37:27 crc kubenswrapper[5094]: I0220 08:37:27.269382 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerID="51c5d24049c628fe72ec29c7d6aad6b1b26637f9d1812b5f27f768d7d83239ed" exitCode=0 Feb 20 08:37:27 crc kubenswrapper[5094]: I0220 08:37:27.269446 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5a2ead87-5f06-45fa-aa5d-347ff16f517e","Type":"ContainerDied","Data":"51c5d24049c628fe72ec29c7d6aad6b1b26637f9d1812b5f27f768d7d83239ed"} Feb 20 08:37:27 crc kubenswrapper[5094]: I0220 08:37:27.269839 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5a2ead87-5f06-45fa-aa5d-347ff16f517e","Type":"ContainerStarted","Data":"1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4"} Feb 20 08:37:28 crc kubenswrapper[5094]: 
I0220 08:37:28.648320 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.672100 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_5a2ead87-5f06-45fa-aa5d-347ff16f517e/mariadb-client/0.log" Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.696110 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.701723 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl" (OuterVolumeSpecName: "kube-api-access-fp2gl") pod "5a2ead87-5f06-45fa-aa5d-347ff16f517e" (UID: "5a2ead87-5f06-45fa-aa5d-347ff16f517e"). InnerVolumeSpecName "kube-api-access-fp2gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.702428 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.713592 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.797624 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:29 crc kubenswrapper[5094]: I0220 08:37:29.292734 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4" Feb 20 08:37:29 crc kubenswrapper[5094]: I0220 08:37:29.292830 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:29 crc kubenswrapper[5094]: I0220 08:37:29.850560 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" path="/var/lib/kubelet/pods/5a2ead87-5f06-45fa-aa5d-347ff16f517e/volumes" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.041175 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.041285 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.089025 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.366773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.415103 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:33 crc kubenswrapper[5094]: I0220 08:37:33.327656 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rdh9h" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" containerID="cri-o://455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" gracePeriod=2 Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.028143 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.188652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"6f9a7688-af6f-4953-b440-409492c949c9\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.188783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"6f9a7688-af6f-4953-b440-409492c949c9\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.188945 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"6f9a7688-af6f-4953-b440-409492c949c9\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.190499 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities" (OuterVolumeSpecName: "utilities") pod "6f9a7688-af6f-4953-b440-409492c949c9" (UID: "6f9a7688-af6f-4953-b440-409492c949c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.193907 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv" (OuterVolumeSpecName: "kube-api-access-cbjtv") pod "6f9a7688-af6f-4953-b440-409492c949c9" (UID: "6f9a7688-af6f-4953-b440-409492c949c9"). InnerVolumeSpecName "kube-api-access-cbjtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.216513 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f9a7688-af6f-4953-b440-409492c949c9" (UID: "6f9a7688-af6f-4953-b440-409492c949c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.290536 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.290574 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.290585 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340014 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f9a7688-af6f-4953-b440-409492c949c9" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" exitCode=0 Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80"} Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340129 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"8c642e0fc4a13dfb17f0c0c42eaaf1f018f91fd34f83c2194b14e4acf8f25a6f"} Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340133 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340157 5094 scope.go:117] "RemoveContainer" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.367959 5094 scope.go:117] "RemoveContainer" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.395002 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.402057 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.411509 5094 scope.go:117] "RemoveContainer" containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.434946 5094 scope.go:117] "RemoveContainer" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" Feb 20 08:37:34 crc kubenswrapper[5094]: E0220 08:37:34.435407 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80\": container with ID starting with 455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80 not found: ID does not exist" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435437 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80"} err="failed to get container status \"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80\": rpc error: code = NotFound desc = could not find container 
\"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80\": container with ID starting with 455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80 not found: ID does not exist" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435457 5094 scope.go:117] "RemoveContainer" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" Feb 20 08:37:34 crc kubenswrapper[5094]: E0220 08:37:34.435798 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06\": container with ID starting with 2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06 not found: ID does not exist" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435834 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06"} err="failed to get container status \"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06\": rpc error: code = NotFound desc = could not find container \"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06\": container with ID starting with 2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06 not found: ID does not exist" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435852 5094 scope.go:117] "RemoveContainer" containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" Feb 20 08:37:34 crc kubenswrapper[5094]: E0220 08:37:34.436176 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513\": container with ID starting with c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513 not found: ID does not exist" 
containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.436236 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513"} err="failed to get container status \"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513\": rpc error: code = NotFound desc = could not find container \"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513\": container with ID starting with c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513 not found: ID does not exist" Feb 20 08:37:35 crc kubenswrapper[5094]: I0220 08:37:35.851229 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9a7688-af6f-4953-b440-409492c949c9" path="/var/lib/kubelet/pods/6f9a7688-af6f-4953-b440-409492c949c9/volumes" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.740933 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741815 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerName="mariadb-client" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741874 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerName="mariadb-client" Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741909 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-utilities" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741919 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-utilities" Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741938 5094 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741947 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741965 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-content" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741971 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-content" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.742202 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.742221 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerName="mariadb-client" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.743570 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.784792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.784922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.784970 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.785816 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.886358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.886503 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.886716 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.887073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.887247 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.907115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:45 crc kubenswrapper[5094]: I0220 08:37:45.065475 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:45 crc kubenswrapper[5094]: I0220 08:37:45.523622 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:46 crc kubenswrapper[5094]: I0220 08:37:46.452445 5094 generic.go:334] "Generic (PLEG): container finished" podID="bf546d1c-5402-468b-a510-df011545b4bf" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" exitCode=0 Feb 20 08:37:46 crc kubenswrapper[5094]: I0220 08:37:46.452545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02"} Feb 20 08:37:46 crc kubenswrapper[5094]: I0220 08:37:46.452890 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerStarted","Data":"b2e4678a0b8ece41a89baa1faf5b87ba135db8df0ae1a2369768bfdb8c3b0a46"} Feb 20 08:37:47 crc kubenswrapper[5094]: I0220 08:37:47.462113 5094 generic.go:334] "Generic (PLEG): container finished" podID="bf546d1c-5402-468b-a510-df011545b4bf" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" exitCode=0 Feb 20 08:37:47 crc kubenswrapper[5094]: I0220 08:37:47.462237 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b"} Feb 20 08:37:48 crc kubenswrapper[5094]: I0220 08:37:48.472499 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" 
event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerStarted","Data":"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd"} Feb 20 08:37:48 crc kubenswrapper[5094]: I0220 08:37:48.501093 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vs8s6" podStartSLOduration=3.11129748 podStartE2EDuration="4.501066588s" podCreationTimestamp="2026-02-20 08:37:44 +0000 UTC" firstStartedPulling="2026-02-20 08:37:46.454624335 +0000 UTC m=+6681.327251046" lastFinishedPulling="2026-02-20 08:37:47.844393453 +0000 UTC m=+6682.717020154" observedRunningTime="2026-02-20 08:37:48.491619721 +0000 UTC m=+6683.364246472" watchObservedRunningTime="2026-02-20 08:37:48.501066588 +0000 UTC m=+6683.373693349" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.067115 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.067810 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.116337 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.573312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.620376 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.541627 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vs8s6" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" 
containerID="cri-o://4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" gracePeriod=2 Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.986999 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.996614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"bf546d1c-5402-468b-a510-df011545b4bf\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.996724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"bf546d1c-5402-468b-a510-df011545b4bf\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.996789 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"bf546d1c-5402-468b-a510-df011545b4bf\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.997527 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities" (OuterVolumeSpecName: "utilities") pod "bf546d1c-5402-468b-a510-df011545b4bf" (UID: "bf546d1c-5402-468b-a510-df011545b4bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.002575 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j" (OuterVolumeSpecName: "kube-api-access-jcn9j") pod "bf546d1c-5402-468b-a510-df011545b4bf" (UID: "bf546d1c-5402-468b-a510-df011545b4bf"). InnerVolumeSpecName "kube-api-access-jcn9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.098851 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.098891 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.371191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf546d1c-5402-468b-a510-df011545b4bf" (UID: "bf546d1c-5402-468b-a510-df011545b4bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.401573 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.549721 5094 generic.go:334] "Generic (PLEG): container finished" podID="bf546d1c-5402-468b-a510-df011545b4bf" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" exitCode=0 Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.549814 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.549813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd"} Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.550131 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"b2e4678a0b8ece41a89baa1faf5b87ba135db8df0ae1a2369768bfdb8c3b0a46"} Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.550156 5094 scope.go:117] "RemoveContainer" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.568827 5094 scope.go:117] "RemoveContainer" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.586384 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:58 crc kubenswrapper[5094]: 
I0220 08:37:58.594085 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.605630 5094 scope.go:117] "RemoveContainer" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.627437 5094 scope.go:117] "RemoveContainer" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" Feb 20 08:37:58 crc kubenswrapper[5094]: E0220 08:37:58.628127 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd\": container with ID starting with 4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd not found: ID does not exist" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628161 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd"} err="failed to get container status \"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd\": rpc error: code = NotFound desc = could not find container \"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd\": container with ID starting with 4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd not found: ID does not exist" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628184 5094 scope.go:117] "RemoveContainer" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" Feb 20 08:37:58 crc kubenswrapper[5094]: E0220 08:37:58.628492 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b\": container 
with ID starting with b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b not found: ID does not exist" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628513 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b"} err="failed to get container status \"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b\": rpc error: code = NotFound desc = could not find container \"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b\": container with ID starting with b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b not found: ID does not exist" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628527 5094 scope.go:117] "RemoveContainer" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" Feb 20 08:37:58 crc kubenswrapper[5094]: E0220 08:37:58.628899 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02\": container with ID starting with f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02 not found: ID does not exist" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628926 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02"} err="failed to get container status \"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02\": rpc error: code = NotFound desc = could not find container \"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02\": container with ID starting with f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02 not 
found: ID does not exist" Feb 20 08:37:59 crc kubenswrapper[5094]: I0220 08:37:59.848072 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf546d1c-5402-468b-a510-df011545b4bf" path="/var/lib/kubelet/pods/bf546d1c-5402-468b-a510-df011545b4bf/volumes" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.599930 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: E0220 08:38:00.600307 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-utilities" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600334 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-utilities" Feb 20 08:38:00 crc kubenswrapper[5094]: E0220 08:38:00.600355 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600363 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" Feb 20 08:38:00 crc kubenswrapper[5094]: E0220 08:38:00.600381 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-content" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600389 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-content" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600563 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.601431 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.603465 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cccp9" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.603889 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.611182 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.613961 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.652625 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.654821 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.663171 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.671915 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.674045 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.681053 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.733891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734549 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-config\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734688 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rp6m\" (UniqueName: \"kubernetes.io/projected/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-kube-api-access-7rp6m\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734802 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734867 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.797835 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.800496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.808131 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-khxrk" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.809828 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.813432 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.815555 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.817635 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.823398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.824896 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.835945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836006 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-config\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7n6t\" (UniqueName: \"kubernetes.io/projected/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-kube-api-access-w7n6t\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836092 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-kube-api-access-gvjj7\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836135 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-config\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836189 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-config\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836409 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rp6m\" (UniqueName: \"kubernetes.io/projected/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-kube-api-access-7rp6m\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836435 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836464 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836533 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836598 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836623 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836654 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.838002 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.840032 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.840437 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-config\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.844596 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.844641 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/af7a81776acc7ca130746af725e9a7d819304a2a8d3e4ea5c067c8428d995108/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.845931 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.846389 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.860254 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.863690 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7rp6m\" (UniqueName: \"kubernetes.io/projected/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-kube-api-access-7rp6m\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.869529 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.911589 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.932138 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940381 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940495 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940521 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940596 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e6d0be3-167e-49e9-8450-a563f9115817-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940650 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940730 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jvv\" (UniqueName: \"kubernetes.io/projected/9e6d0be3-167e-49e9-8450-a563f9115817-kube-api-access-v6jvv\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-config\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7n6t\" (UniqueName: \"kubernetes.io/projected/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-kube-api-access-w7n6t\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6d0be3-167e-49e9-8450-a563f9115817-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940860 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-config\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-kube-api-access-gvjj7\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57141-e76b-43a8-b363-2a1c7129d7c2-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941057 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-config\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 
08:38:00.941150 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941176 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4755z\" (UniqueName: \"kubernetes.io/projected/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-kube-api-access-4755z\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941213 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941269 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg4dh\" (UniqueName: \"kubernetes.io/projected/daa57141-e76b-43a8-b363-2a1c7129d7c2-kube-api-access-wg4dh\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941334 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daa57141-e76b-43a8-b363-2a1c7129d7c2-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941363 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941391 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-config\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941477 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-ovsdb-rundir\") pod 
\"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941692 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.944972 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-config\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.946915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.947773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.948598 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.949135 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.949393 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.949424 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4f917b20bcaaf63d298ed333d7ec9bccd5f615564cff30af71dc3e2e9860eebf/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.950070 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.950099 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e7eacfc8f768e9fa5f5a033cdbc14ac604c8bf71d6df88886a02074aa564c63/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.951468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-config\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.952600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.961214 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.970369 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7n6t\" (UniqueName: \"kubernetes.io/projected/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-kube-api-access-w7n6t\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " 
pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.972823 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-kube-api-access-gvjj7\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.988023 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:00.992496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043407 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg4dh\" (UniqueName: \"kubernetes.io/projected/daa57141-e76b-43a8-b363-2a1c7129d7c2-kube-api-access-wg4dh\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043765 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 
08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daa57141-e76b-43a8-b363-2a1c7129d7c2-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-config\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043860 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044550 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044882 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e6d0be3-167e-49e9-8450-a563f9115817-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daa57141-e76b-43a8-b363-2a1c7129d7c2-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044954 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044968 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-config\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" 
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045290 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e6d0be3-167e-49e9-8450-a563f9115817-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045830 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6jvv\" (UniqueName: \"kubernetes.io/projected/9e6d0be3-167e-49e9-8450-a563f9115817-kube-api-access-v6jvv\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045915 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6d0be3-167e-49e9-8450-a563f9115817-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045936 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046189 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-config\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57141-e76b-43a8-b363-2a1c7129d7c2-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046313 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046715 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 
08:38:01.046840 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4755z\" (UniqueName: \"kubernetes.io/projected/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-kube-api-access-4755z\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047265 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047379 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-config\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047679 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047712 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3de976b4516acfae8f0ae7592c7947fd04a70506f0f82f5f85f87b0700dab6c1/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047723 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047742 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01a95fc33ca55a11f0de736abe42c39664c673b1158062cf6e2d13e8309e9d6d/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047856 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047902 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/56f10ad7da66177b62cc526e0d3d231fa0fafed9b03ca85a0c04fc0671d7fe8b/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.049882 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.050399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6d0be3-167e-49e9-8450-a563f9115817-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: 
\"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.056597 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57141-e76b-43a8-b363-2a1c7129d7c2-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.123399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg4dh\" (UniqueName: \"kubernetes.io/projected/daa57141-e76b-43a8-b363-2a1c7129d7c2-kube-api-access-wg4dh\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.128820 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6jvv\" (UniqueName: \"kubernetes.io/projected/9e6d0be3-167e-49e9-8450-a563f9115817-kube-api-access-v6jvv\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.133913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4755z\" (UniqueName: \"kubernetes.io/projected/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-kube-api-access-4755z\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.142453 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 
08:38:01.146817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.148356 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.223283 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.230223 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.274434 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.288983 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.419067 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.651108 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.709430 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.812459 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 20 08:38:01 crc kubenswrapper[5094]: W0220 08:38:01.815381 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa57141_e76b_43a8_b363_2a1c7129d7c2.slice/crio-f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae WatchSource:0}: Error finding container f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae: Status 404 returned error can't find the container with id f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.940722 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 20 08:38:01 crc kubenswrapper[5094]: W0220 08:38:01.945031 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a5fd3fa_b5c3_4e02_a9e6_26be7e747baf.slice/crio-06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7 WatchSource:0}: Error finding container 06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7: Status 404 returned error can't find the container with id 06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7 Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.104993 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.595972 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf","Type":"ContainerStarted","Data":"06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7"} Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.597673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6ff74bc5-95bf-47fd-969e-cecbf1317e5d","Type":"ContainerStarted","Data":"f01f91db9ecefd5a772f69460286181a4eb87ea9321f41fcf8762ab79ec8209e"} Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.598889 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"daa57141-e76b-43a8-b363-2a1c7129d7c2","Type":"ContainerStarted","Data":"f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae"} Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.600130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9e6d0be3-167e-49e9-8450-a563f9115817","Type":"ContainerStarted","Data":"07ac9677cab8c7dc7d9f8fb253454cf9f1ccb83ba8bee4cca6e7b4cc7a2ab0ee"} Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.601557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"464f9fe3-85bf-4e78-adc3-3feedbaf1dac","Type":"ContainerStarted","Data":"f7f1f7e993447666b52826c9f29a6032a140b4d9da2ada87940301108b4b71f4"} Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.671443 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 20 08:38:03 crc kubenswrapper[5094]: I0220 08:38:03.616985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"79dfde5a-85a9-437f-979d-1fdb99a1bb5f","Type":"ContainerStarted","Data":"41c8000ed5958ae552e5896097b5d15e857e74ee85c49d153c5cf5700209b743"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.643833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-2" event={"ID":"79dfde5a-85a9-437f-979d-1fdb99a1bb5f","Type":"ContainerStarted","Data":"8d75a82b08cf0e4ed65386619c69af8dad6fc6ac01d03ab70938690b266fd92f"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.645662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"464f9fe3-85bf-4e78-adc3-3feedbaf1dac","Type":"ContainerStarted","Data":"23c515e4903b122b59853efaa4b81a82e3af81ae56b30822173437a630639389"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.645682 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"464f9fe3-85bf-4e78-adc3-3feedbaf1dac","Type":"ContainerStarted","Data":"cad877db9087f42fae5beb889ae0251566dcf2cb5740cd06889aa6e148238bb6"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.648058 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf","Type":"ContainerStarted","Data":"bba0a9f73377e804cd503c5cafe076f419799ed3bb1954426a828acf7ef836af"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.648098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf","Type":"ContainerStarted","Data":"d4bdf6c215bd679fb0bb92e8eac97b0a597ad3bbb7f3ca15c939b5140aacfab6"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.657544 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6ff74bc5-95bf-47fd-969e-cecbf1317e5d","Type":"ContainerStarted","Data":"e5442ddc216d0fffaf41f19267974c16eb734de160956a981ffde266c3dfb6ff"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.657595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"6ff74bc5-95bf-47fd-969e-cecbf1317e5d","Type":"ContainerStarted","Data":"894019acdbeb1b3c8070ec89b10e9a16d33b0aec1d0ff9f59d4b97c89bf52360"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.659612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"daa57141-e76b-43a8-b363-2a1c7129d7c2","Type":"ContainerStarted","Data":"6dd3c4055c326a927f1cf9ca2c3c9c971d515a2985402433d11bad89c3adc4c0"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.659644 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"daa57141-e76b-43a8-b363-2a1c7129d7c2","Type":"ContainerStarted","Data":"6c639df003d8c9072fcb16082e4d25ba1ad201993e8cb56f59b7dda97b6a8df5"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.665188 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.5034545489999998 podStartE2EDuration="7.665173924s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.656202148 +0000 UTC m=+6696.528828859" lastFinishedPulling="2026-02-20 08:38:05.817921523 +0000 UTC m=+6700.690548234" observedRunningTime="2026-02-20 08:38:06.664857347 +0000 UTC m=+6701.537484058" watchObservedRunningTime="2026-02-20 08:38:06.665173924 +0000 UTC m=+6701.537800635" Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.665244 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9e6d0be3-167e-49e9-8450-a563f9115817","Type":"ContainerStarted","Data":"4c13e72dcc45e8131fc1636a705a643d6f7623e809f34bdf7f0898d98c9ded45"} Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.665294 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9e6d0be3-167e-49e9-8450-a563f9115817","Type":"ContainerStarted","Data":"c52b3f7de314c2fed27b6a3ca93e11553d22d024cb7cff3ad9b2ba6cb7843fe6"} Feb 20 
08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.694151 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.810295625 podStartE2EDuration="7.694134542s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.948989205 +0000 UTC m=+6696.821615916" lastFinishedPulling="2026-02-20 08:38:05.832828122 +0000 UTC m=+6700.705454833" observedRunningTime="2026-02-20 08:38:06.690411642 +0000 UTC m=+6701.563038383" watchObservedRunningTime="2026-02-20 08:38:06.694134542 +0000 UTC m=+6701.566761253" Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.714116 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.715736389 podStartE2EDuration="7.714067381s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.819933129 +0000 UTC m=+6696.692559840" lastFinishedPulling="2026-02-20 08:38:05.818264121 +0000 UTC m=+6700.690890832" observedRunningTime="2026-02-20 08:38:06.712541554 +0000 UTC m=+6701.585168265" watchObservedRunningTime="2026-02-20 08:38:06.714067381 +0000 UTC m=+6701.586694092" Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.729827 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.029957792 podStartE2EDuration="7.72981176s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:02.119451578 +0000 UTC m=+6696.992078289" lastFinishedPulling="2026-02-20 08:38:05.819305546 +0000 UTC m=+6700.691932257" observedRunningTime="2026-02-20 08:38:06.726105131 +0000 UTC m=+6701.598731842" watchObservedRunningTime="2026-02-20 08:38:06.72981176 +0000 UTC m=+6701.602438471" Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.743363 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.641114193 podStartE2EDuration="7.743347166s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.715488095 +0000 UTC m=+6696.588114806" lastFinishedPulling="2026-02-20 08:38:05.817721068 +0000 UTC m=+6700.690347779" observedRunningTime="2026-02-20 08:38:06.743031338 +0000 UTC m=+6701.615658059" watchObservedRunningTime="2026-02-20 08:38:06.743347166 +0000 UTC m=+6701.615973877" Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.933213 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.224811 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.230956 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.275312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.419552 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.674266 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"79dfde5a-85a9-437f-979d-1fdb99a1bb5f","Type":"ContainerStarted","Data":"f7d4b556c6eff4b7976324b90fe5d2082b5e471da261cf1356c01b752ea25135"} Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.695308 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=5.194745235 podStartE2EDuration="8.695289317s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:02.691238509 +0000 UTC 
m=+6697.563865220" lastFinishedPulling="2026-02-20 08:38:06.191782591 +0000 UTC m=+6701.064409302" observedRunningTime="2026-02-20 08:38:07.689396036 +0000 UTC m=+6702.562022747" watchObservedRunningTime="2026-02-20 08:38:07.695289317 +0000 UTC m=+6702.567916028" Feb 20 08:38:09 crc kubenswrapper[5094]: I0220 08:38:09.987390 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:09 crc kubenswrapper[5094]: I0220 08:38:09.987726 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.268253 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.268961 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.283070 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.283352 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.289693 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.316752 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.317091 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.335624 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 
08:38:10.457584 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.458267 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.701466 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.293695 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.293951 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.310803 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.351079 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.454017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.512806 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"] Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.514164 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.517309 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.528472 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"] Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.644422 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"] Feb 20 08:38:11 crc kubenswrapper[5094]: E0220 08:38:11.645237 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-79prf ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" podUID="bc60d174-9065-440e-b292-7b9646fdc03c" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645516 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79prf\" (UniqueName: 
\"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.671968 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"] Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.673151 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.674891 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.686215 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"] Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.708775 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.720490 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746836 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746882 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746924 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.747646 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc 
kubenswrapper[5094]: I0220 08:38:11.747648 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.747975 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.763415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.847865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.847921 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.847983 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848338 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config" (OuterVolumeSpecName: "config") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848598 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848999 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849206 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849281 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849661 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849693 5094 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849735 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.852432 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf" (OuterVolumeSpecName: "kube-api-access-79prf") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "kube-api-access-79prf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.950891 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.950986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951064 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 
08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951445 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951464 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952158 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952357 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " 
pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.968970 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.990523 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.475050 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"] Feb 20 08:38:12 crc kubenswrapper[5094]: W0220 08:38:12.477755 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59402637_ea41_4c78_a455_361d55c5422a.slice/crio-cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b WatchSource:0}: Error finding container cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b: Status 404 returned error can't find the container with id cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720039 5094 generic.go:334] "Generic (PLEG): container finished" podID="59402637-ea41-4c78-a455-361d55c5422a" containerID="63ebc0843e1f18b14a59e973034594be94e63f23e4b14fd38a292a888d5971cd" exitCode=0 Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerDied","Data":"63ebc0843e1f18b14a59e973034594be94e63f23e4b14fd38a292a888d5971cd"} Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720259 5094 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720266 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerStarted","Data":"cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b"} Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.823909 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"] Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.824362 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"] Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.733524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerStarted","Data":"c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43"} Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.733793 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.771601 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" podStartSLOduration=2.771570752 podStartE2EDuration="2.771570752s" podCreationTimestamp="2026-02-20 08:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:13.767037833 +0000 UTC m=+6708.639664564" watchObservedRunningTime="2026-02-20 08:38:13.771570752 +0000 UTC m=+6708.644197503" Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.854335 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc60d174-9065-440e-b292-7b9646fdc03c" 
path="/var/lib/kubelet/pods/bc60d174-9065-440e-b292-7b9646fdc03c/volumes" Feb 20 08:38:15 crc kubenswrapper[5094]: I0220 08:38:15.978348 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.214349 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.217009 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.219954 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.239616 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.321685 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.322255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.322347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: 
\"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.424096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.424181 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.424237 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.427384 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.427421 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ddeed08d804d11f30f0c2d3797e874048b40f01e45c1c9db13016d536198515/globalmount\"" pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.432797 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.447219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.468961 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.546518 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.119038 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 08:38:20 crc kubenswrapper[5094]: W0220 08:38:20.127700 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4111d2dd_641f_4113_8751_4151d435e934.slice/crio-12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4 WatchSource:0}: Error finding container 12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4: Status 404 returned error can't find the container with id 12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4 Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.794938 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerStarted","Data":"0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e"} Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.795009 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerStarted","Data":"12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4"} Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.827170 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.5960703450000002 podStartE2EDuration="2.827144596s" podCreationTimestamp="2026-02-20 08:38:18 +0000 UTC" firstStartedPulling="2026-02-20 08:38:20.13385138 +0000 UTC m=+6715.006478131" lastFinishedPulling="2026-02-20 08:38:20.364925671 +0000 UTC m=+6715.237552382" observedRunningTime="2026-02-20 08:38:20.814574583 +0000 UTC m=+6715.687201324" watchObservedRunningTime="2026-02-20 08:38:20.827144596 +0000 UTC m=+6715.699771337" Feb 20 08:38:21 crc 
kubenswrapper[5094]: I0220 08:38:21.991987 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.056783 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.057057 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" containerID="cri-o://a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" gracePeriod=10 Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.550116 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.587413 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"6f4c545b-01fc-4e08-994c-7d24a10a963e\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.587688 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"6f4c545b-01fc-4e08-994c-7d24a10a963e\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.587962 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod \"6f4c545b-01fc-4e08-994c-7d24a10a963e\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.606989 5094 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2" (OuterVolumeSpecName: "kube-api-access-9shd2") pod "6f4c545b-01fc-4e08-994c-7d24a10a963e" (UID: "6f4c545b-01fc-4e08-994c-7d24a10a963e"). InnerVolumeSpecName "kube-api-access-9shd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.627943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f4c545b-01fc-4e08-994c-7d24a10a963e" (UID: "6f4c545b-01fc-4e08-994c-7d24a10a963e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.629884 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config" (OuterVolumeSpecName: "config") pod "6f4c545b-01fc-4e08-994c-7d24a10a963e" (UID: "6f4c545b-01fc-4e08-994c-7d24a10a963e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.690217 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.690412 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.690469 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.811788 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" exitCode=0 Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.811849 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.811845 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerDied","Data":"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3"} Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.812276 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerDied","Data":"980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca"} Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.812297 5094 scope.go:117] "RemoveContainer" containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.830116 5094 scope.go:117] "RemoveContainer" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.841861 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.847980 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.850286 5094 scope.go:117] "RemoveContainer" containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" Feb 20 08:38:22 crc kubenswrapper[5094]: E0220 08:38:22.850747 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3\": container with ID starting with a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3 not found: ID does not exist" 
containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.850864 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3"} err="failed to get container status \"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3\": rpc error: code = NotFound desc = could not find container \"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3\": container with ID starting with a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3 not found: ID does not exist" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.850959 5094 scope.go:117] "RemoveContainer" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" Feb 20 08:38:22 crc kubenswrapper[5094]: E0220 08:38:22.851458 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84\": container with ID starting with 6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84 not found: ID does not exist" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.851560 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84"} err="failed to get container status \"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84\": rpc error: code = NotFound desc = could not find container \"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84\": container with ID starting with 6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84 not found: ID does not exist" Feb 20 08:38:23 crc kubenswrapper[5094]: I0220 08:38:23.851454 5094 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" path="/var/lib/kubelet/pods/6f4c545b-01fc-4e08-994c-7d24a10a963e/volumes" Feb 20 08:38:27 crc kubenswrapper[5094]: E0220 08:38:27.936901 5094 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.188:34014->38.102.83.188:42807: write tcp 38.102.83.188:34014->38.102.83.188:42807: write: broken pipe Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.291866 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 20 08:38:29 crc kubenswrapper[5094]: E0220 08:38:29.292336 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.292356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" Feb 20 08:38:29 crc kubenswrapper[5094]: E0220 08:38:29.292391 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="init" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.292399 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="init" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.292585 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.294082 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.295891 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.295917 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-g4jlf" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.297190 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.306933 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.408601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-scripts\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409041 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-config\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409074 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409148 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfq5\" 
(UniqueName: \"kubernetes.io/projected/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-kube-api-access-znfq5\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409202 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510362 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510457 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znfq5\" (UniqueName: \"kubernetes.io/projected/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-kube-api-access-znfq5\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510481 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510538 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-scripts\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " 
pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510566 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-config\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.512800 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-scripts\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.512815 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-config\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.517468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.526461 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfq5\" (UniqueName: \"kubernetes.io/projected/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-kube-api-access-znfq5\") pod 
\"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0"
Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.616066 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 20 08:38:30 crc kubenswrapper[5094]: I0220 08:38:30.187788 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 08:38:30 crc kubenswrapper[5094]: I0220 08:38:30.908025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a","Type":"ContainerStarted","Data":"123d94f8371eb846e916485504943df8a24d926aa8884f217c14b7c3e87aac28"}
Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.918104 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a","Type":"ContainerStarted","Data":"0c180cd6c00e3610429ef6452551fc6bec59df4886e740d49346a96f85490f71"}
Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.918478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a","Type":"ContainerStarted","Data":"a3978845dbbac7c8217dc5684fa73e1b55aaa287ef2ea2f0928de9dc6434e1ab"}
Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.919863 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.939442 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.27140524 podStartE2EDuration="2.939408957s" podCreationTimestamp="2026-02-20 08:38:29 +0000 UTC" firstStartedPulling="2026-02-20 08:38:30.200213938 +0000 UTC m=+6725.072840649" lastFinishedPulling="2026-02-20 08:38:30.868217665 +0000 UTC m=+6725.740844366" observedRunningTime="2026-02-20 08:38:31.934154291 +0000 UTC m=+6726.806781052" watchObservedRunningTime="2026-02-20 08:38:31.939408957 +0000 UTC m=+6726.812035708"
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.849845 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-l2kgb"]
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.851634 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.858396 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l2kgb"]
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.943117 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"]
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.944157 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.946373 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.956844 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.957299 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.957489 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"]
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058656 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058747 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058802 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.059495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.076176 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.159831 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.160289 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.161081 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.170007 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.181694 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.260002 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:38 crc kubenswrapper[5094]: W0220 08:38:38.613502 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec59a7fc_e360_4e39_8c57_cfaa43d23566.slice/crio-7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238 WatchSource:0}: Error finding container 7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238: Status 404 returned error can't find the container with id 7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.616136 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l2kgb"]
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.703316 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"]
Feb 20 08:38:38 crc kubenswrapper[5094]: W0220 08:38:38.712928 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1678de_0344_47d5_98bb_d9ffd63912e7.slice/crio-f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a WatchSource:0}: Error finding container f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a: Status 404 returned error can't find the container with id f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.974641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5af6-account-create-update-9f77r" event={"ID":"1d1678de-0344-47d5-98bb-d9ffd63912e7","Type":"ContainerStarted","Data":"f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a"}
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.976187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerStarted","Data":"cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac"}
Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.976262 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerStarted","Data":"7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238"}
Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.004861 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-l2kgb" podStartSLOduration=2.004828478 podStartE2EDuration="2.004828478s" podCreationTimestamp="2026-02-20 08:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:38.995307639 +0000 UTC m=+6733.867934390" watchObservedRunningTime="2026-02-20 08:38:39.004828478 +0000 UTC m=+6733.877455199"
Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.990397 5094 generic.go:334] "Generic (PLEG): container finished" podID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerID="52db5b53565602a22b540482712ac73023427fa1b0c5c5dd0a43d58c9fbc73b5" exitCode=0
Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.990603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5af6-account-create-update-9f77r" event={"ID":"1d1678de-0344-47d5-98bb-d9ffd63912e7","Type":"ContainerDied","Data":"52db5b53565602a22b540482712ac73023427fa1b0c5c5dd0a43d58c9fbc73b5"}
Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.994458 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerID="cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac" exitCode=0
Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.994527 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerDied","Data":"cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac"}
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.383068 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.389211 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") "
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426549 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"1d1678de-0344-47d5-98bb-d9ffd63912e7\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") "
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426582 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"1d1678de-0344-47d5-98bb-d9ffd63912e7\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") "
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") "
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.427469 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec59a7fc-e360-4e39-8c57-cfaa43d23566" (UID: "ec59a7fc-e360-4e39-8c57-cfaa43d23566"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.427488 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d1678de-0344-47d5-98bb-d9ffd63912e7" (UID: "1d1678de-0344-47d5-98bb-d9ffd63912e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.441074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s" (OuterVolumeSpecName: "kube-api-access-7wj4s") pod "ec59a7fc-e360-4e39-8c57-cfaa43d23566" (UID: "ec59a7fc-e360-4e39-8c57-cfaa43d23566"). InnerVolumeSpecName "kube-api-access-7wj4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.441164 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5" (OuterVolumeSpecName: "kube-api-access-2g9h5") pod "1d1678de-0344-47d5-98bb-d9ffd63912e7" (UID: "1d1678de-0344-47d5-98bb-d9ffd63912e7"). InnerVolumeSpecName "kube-api-access-2g9h5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528375 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528406 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528417 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528427 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.009316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5af6-account-create-update-9f77r" event={"ID":"1d1678de-0344-47d5-98bb-d9ffd63912e7","Type":"ContainerDied","Data":"f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a"}
Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.009359 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a"
Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.009363 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r"
Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.010674 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerDied","Data":"7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238"}
Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.010711 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238"
Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.010765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l2kgb"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.415484 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bh8cv"]
Feb 20 08:38:43 crc kubenswrapper[5094]: E0220 08:38:43.416803 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerName="mariadb-database-create"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.416878 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerName="mariadb-database-create"
Feb 20 08:38:43 crc kubenswrapper[5094]: E0220 08:38:43.416932 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerName="mariadb-account-create-update"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.416980 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerName="mariadb-account-create-update"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.417204 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerName="mariadb-database-create"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.417279 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerName="mariadb-account-create-update"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.417912 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.431404 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.434301 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.434350 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.434442 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.440875 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bh8cv"]
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.462884 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.462942 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.463107 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.565083 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.565560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.565784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.570289 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.570441 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.581351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.734265 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:44 crc kubenswrapper[5094]: I0220 08:38:44.229650 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bh8cv"]
Feb 20 08:38:45 crc kubenswrapper[5094]: I0220 08:38:45.059120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerStarted","Data":"498c87e28265cbefbe45ff4c9051d34828ab6173c31904d7e27419e27276c4e5"}
Feb 20 08:38:49 crc kubenswrapper[5094]: I0220 08:38:49.684636 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 20 08:38:50 crc kubenswrapper[5094]: I0220 08:38:50.094943 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerStarted","Data":"8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454"}
Feb 20 08:38:50 crc kubenswrapper[5094]: I0220 08:38:50.113767 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bh8cv" podStartSLOduration=2.357895886 podStartE2EDuration="7.113739559s" podCreationTimestamp="2026-02-20 08:38:43 +0000 UTC" firstStartedPulling="2026-02-20 08:38:44.230173473 +0000 UTC m=+6739.102800224" lastFinishedPulling="2026-02-20 08:38:48.986017186 +0000 UTC m=+6743.858643897" observedRunningTime="2026-02-20 08:38:50.110263475 +0000 UTC m=+6744.982890196" watchObservedRunningTime="2026-02-20 08:38:50.113739559 +0000 UTC m=+6744.986366280"
Feb 20 08:38:51 crc kubenswrapper[5094]: I0220 08:38:51.103484 5094 generic.go:334] "Generic (PLEG): container finished" podID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerID="8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454" exitCode=0
Feb 20 08:38:51 crc kubenswrapper[5094]: I0220 08:38:51.103571 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerDied","Data":"8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454"}
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.400298 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.508316 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") "
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.508398 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") "
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.508426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") "
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.512869 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc" (OuterVolumeSpecName: "kube-api-access-s7zvc") pod "81601ce5-f2ae-4f57-a829-6b235b7ae4df" (UID: "81601ce5-f2ae-4f57-a829-6b235b7ae4df"). InnerVolumeSpecName "kube-api-access-s7zvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.530735 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81601ce5-f2ae-4f57-a829-6b235b7ae4df" (UID: "81601ce5-f2ae-4f57-a829-6b235b7ae4df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.547039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data" (OuterVolumeSpecName: "config-data") pod "81601ce5-f2ae-4f57-a829-6b235b7ae4df" (UID: "81601ce5-f2ae-4f57-a829-6b235b7ae4df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.610404 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.610435 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.610477 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.121203 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerDied","Data":"498c87e28265cbefbe45ff4c9051d34828ab6173c31904d7e27419e27276c4e5"}
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.121529 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="498c87e28265cbefbe45ff4c9051d34828ab6173c31904d7e27419e27276c4e5"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.121274 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bh8cv"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.334408 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"]
Feb 20 08:38:53 crc kubenswrapper[5094]: E0220 08:38:53.334833 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerName="keystone-db-sync"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.334855 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerName="keystone-db-sync"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.335085 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerName="keystone-db-sync"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.336159 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.357206 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"]
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.401439 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sfb9q"]
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.402465 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.406531 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.406682 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.407902 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.408088 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.411947 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.416526 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"]
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.420815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.420915 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.420980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.421011 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.421044 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522295 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522367 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522407 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522436 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522478 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522639 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522800 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.523796 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.524288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName:
\"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.525338 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.525543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.542523 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624134 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624563 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: 
\"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624780 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.628427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc 
kubenswrapper[5094]: I0220 08:38:53.628677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.629443 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.629520 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.630121 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.644750 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.658959 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.719167 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:54 crc kubenswrapper[5094]: W0220 08:38:54.148447 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b60dac_2fbe_46ba_acc9_92058e10f2d1.slice/crio-b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef WatchSource:0}: Error finding container b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef: Status 404 returned error can't find the container with id b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef Feb 20 08:38:54 crc kubenswrapper[5094]: I0220 08:38:54.148772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"] Feb 20 08:38:54 crc kubenswrapper[5094]: I0220 08:38:54.222117 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:38:54 crc kubenswrapper[5094]: W0220 08:38:54.227489 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80bba0dd_119d_47bb_8526_df2e59c5b132.slice/crio-b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743 WatchSource:0}: Error finding container b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743: Status 404 returned error can't find the container with id b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743 Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.134225 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerStarted","Data":"3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9"} Feb 20 08:38:55 crc 
kubenswrapper[5094]: I0220 08:38:55.134502 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerStarted","Data":"b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743"} Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.136653 5094 generic.go:334] "Generic (PLEG): container finished" podID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerID="d7d23a98ee3bcf78f157fef71692c3b42c0ccd4e53f68bb8466090a9c903b801" exitCode=0 Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.136693 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerDied","Data":"d7d23a98ee3bcf78f157fef71692c3b42c0ccd4e53f68bb8466090a9c903b801"} Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.136731 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerStarted","Data":"b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef"} Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.156034 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sfb9q" podStartSLOduration=2.156014187 podStartE2EDuration="2.156014187s" podCreationTimestamp="2026-02-20 08:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:55.150218408 +0000 UTC m=+6750.022845119" watchObservedRunningTime="2026-02-20 08:38:55.156014187 +0000 UTC m=+6750.028640898" Feb 20 08:38:56 crc kubenswrapper[5094]: I0220 08:38:56.146006 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" 
event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerStarted","Data":"24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a"} Feb 20 08:38:56 crc kubenswrapper[5094]: I0220 08:38:56.169947 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" podStartSLOduration=3.16992676 podStartE2EDuration="3.16992676s" podCreationTimestamp="2026-02-20 08:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:56.169368887 +0000 UTC m=+6751.041995598" watchObservedRunningTime="2026-02-20 08:38:56.16992676 +0000 UTC m=+6751.042553471" Feb 20 08:38:57 crc kubenswrapper[5094]: I0220 08:38:57.155343 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:58 crc kubenswrapper[5094]: I0220 08:38:58.163203 5094 generic.go:334] "Generic (PLEG): container finished" podID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerID="3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9" exitCode=0 Feb 20 08:38:58 crc kubenswrapper[5094]: I0220 08:38:58.163286 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerDied","Data":"3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9"} Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.475575 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633152 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633244 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633291 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633335 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633402 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633429 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2b5t\" (UniqueName: 
\"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.639840 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t" (OuterVolumeSpecName: "kube-api-access-s2b5t") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "kube-api-access-s2b5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.639943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.640839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.651555 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts" (OuterVolumeSpecName: "scripts") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.656819 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data" (OuterVolumeSpecName: "config-data") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.666536 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.735568 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.735876 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736028 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736134 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 
08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736247 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736369 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.179580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerDied","Data":"b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743"} Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.179616 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.179871 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.360856 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.365848 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.439078 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:39:00 crc kubenswrapper[5094]: E0220 08:39:00.439787 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerName="keystone-bootstrap" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.439808 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerName="keystone-bootstrap" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.440194 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerName="keystone-bootstrap" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.440919 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.448072 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.448394 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.448580 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.454470 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.458288 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.470655 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549152 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549207 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549302 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549360 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.650640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.650940 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.651096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.652091 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.652146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.652424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.654849 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"keystone-bootstrap-869rz\" 
(UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.655123 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.655759 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.656814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.658645 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.668543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.762012 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:01 crc kubenswrapper[5094]: I0220 08:39:01.196244 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-869rz"]
Feb 20 08:39:01 crc kubenswrapper[5094]: I0220 08:39:01.853342 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" path="/var/lib/kubelet/pods/80bba0dd-119d-47bb-8526-df2e59c5b132/volumes"
Feb 20 08:39:02 crc kubenswrapper[5094]: I0220 08:39:02.203918 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerStarted","Data":"130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92"}
Feb 20 08:39:02 crc kubenswrapper[5094]: I0220 08:39:02.203967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerStarted","Data":"e176cf819b6fe2d61f86091fa78711313240cd2e99aa818e6e9adb05b61379d4"}
Feb 20 08:39:02 crc kubenswrapper[5094]: I0220 08:39:02.233624 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-869rz" podStartSLOduration=2.233597322 podStartE2EDuration="2.233597322s" podCreationTimestamp="2026-02-20 08:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:39:02.227772052 +0000 UTC m=+6757.100398763" watchObservedRunningTime="2026-02-20 08:39:02.233597322 +0000 UTC m=+6757.106224073"
Feb 20 08:39:03 crc kubenswrapper[5094]: I0220 08:39:03.660930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:39:03 crc kubenswrapper[5094]: I0220 08:39:03.742608 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"]
Feb 20 08:39:03 crc kubenswrapper[5094]: I0220 08:39:03.744282 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns" containerID="cri-o://c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43" gracePeriod=10
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.106881 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.106935 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.224972 5094 generic.go:334] "Generic (PLEG): container finished" podID="59402637-ea41-4c78-a455-361d55c5422a" containerID="c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43" exitCode=0
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.225038 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerDied","Data":"c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43"}
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.225062 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerDied","Data":"cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b"}
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.225072 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b"
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.226805 5094 generic.go:334] "Generic (PLEG): container finished" podID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerID="130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92" exitCode=0
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.226843 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerDied","Data":"130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92"}
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.233890 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") "
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310567 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") "
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310611 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") "
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") "
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310820 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") "
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.327988 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr" (OuterVolumeSpecName: "kube-api-access-k5tlr") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "kube-api-access-k5tlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.354526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.392117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.398130 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config" (OuterVolumeSpecName: "config") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.400481 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415076 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415166 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415182 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415236 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415253 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.233538 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.268243 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"]
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.273234 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"]
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.580502 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734391 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") "
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734433 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") "
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") "
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734565 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") "
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734583 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") "
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") "
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739652 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts" (OuterVolumeSpecName: "scripts") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739684 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739843 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l" (OuterVolumeSpecName: "kube-api-access-fsn6l") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "kube-api-access-fsn6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.757264 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.761110 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data" (OuterVolumeSpecName: "config-data") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836527 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836570 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836583 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836595 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836610 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836621 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.849843 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59402637-ea41-4c78-a455-361d55c5422a" path="/var/lib/kubelet/pods/59402637-ea41-4c78-a455-361d55c5422a/volumes"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.240299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerDied","Data":"e176cf819b6fe2d61f86091fa78711313240cd2e99aa818e6e9adb05b61379d4"}
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.240336 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e176cf819b6fe2d61f86091fa78711313240cd2e99aa818e6e9adb05b61379d4"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.240347 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-869rz"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.666252 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f444df446-vdhbp"]
Feb 20 08:39:06 crc kubenswrapper[5094]: E0220 08:39:06.667129 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667157 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns"
Feb 20 08:39:06 crc kubenswrapper[5094]: E0220 08:39:06.667199 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerName="keystone-bootstrap"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667211 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerName="keystone-bootstrap"
Feb 20 08:39:06 crc kubenswrapper[5094]: E0220 08:39:06.667248 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="init"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667260 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="init"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667492 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667521 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerName="keystone-bootstrap"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.668216 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.670362 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.670831 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.677782 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.682497 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.684147 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f444df446-vdhbp"]
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749352 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-combined-ca-bundle\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-scripts\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749479 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsk9g\" (UniqueName: \"kubernetes.io/projected/167ab003-3908-4714-95b2-bfad7c1e1e00-kube-api-access-zsk9g\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-credential-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749533 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-fernet-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749620 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-config-data\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851294 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-fernet-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851417 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-config-data\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851464 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-combined-ca-bundle\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851515 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-scripts\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851555 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsk9g\" (UniqueName: \"kubernetes.io/projected/167ab003-3908-4714-95b2-bfad7c1e1e00-kube-api-access-zsk9g\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851580 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-credential-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.855543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-scripts\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.855833 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-fernet-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.857042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-combined-ca-bundle\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.857149 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-credential-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.858877 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-config-data\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.871351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsk9g\" (UniqueName: \"kubernetes.io/projected/167ab003-3908-4714-95b2-bfad7c1e1e00-kube-api-access-zsk9g\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.986144 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:07 crc kubenswrapper[5094]: I0220 08:39:07.386606 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f444df446-vdhbp"]
Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.256084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f444df446-vdhbp" event={"ID":"167ab003-3908-4714-95b2-bfad7c1e1e00","Type":"ContainerStarted","Data":"d8b59a4889dd70a1cfabcd0744e95db17f9a524587e18d6a759a3a34faa924f2"}
Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.256544 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.256559 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f444df446-vdhbp" event={"ID":"167ab003-3908-4714-95b2-bfad7c1e1e00","Type":"ContainerStarted","Data":"732f63a2ef73a7882303d350cbe2352e9029f0e590d815ed7f259e3a8763489e"}
Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.273267 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f444df446-vdhbp" podStartSLOduration=2.273245365 podStartE2EDuration="2.273245365s" podCreationTimestamp="2026-02-20 08:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:39:08.272567258 +0000 UTC m=+6763.145193969" watchObservedRunningTime="2026-02-20 08:39:08.273245365 +0000 UTC m=+6763.145872076"
Feb 20 08:39:34 crc kubenswrapper[5094]: I0220 08:39:34.107078 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:39:34 crc kubenswrapper[5094]: I0220 08:39:34.107897 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:39:38 crc kubenswrapper[5094]: I0220 08:39:38.474187 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f444df446-vdhbp"
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.817822 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.819335 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.825257 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.825592 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.831470 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9882x"
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.841377 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.987320 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.987382 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.987568 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.089653 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.089879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.089948 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.091006 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.103088 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.109486 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.140983 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.583435 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 20 08:39:43 crc kubenswrapper[5094]: W0220 08:39:43.593645 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4afe958_0e78_49e9_b05a_08ff4c42f602.slice/crio-f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f WatchSource:0}: Error finding container f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f: Status 404 returned error can't find the container with id f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f
Feb 20 08:39:44 crc kubenswrapper[5094]: I0220 08:39:44.589187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b4afe958-0e78-49e9-b05a-08ff4c42f602","Type":"ContainerStarted","Data":"f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f"}
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.061921 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"]
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.063822 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s"
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.069679 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"]
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.213900 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s"
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.213956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s"
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.214125 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s"
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316198 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s"
Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316257 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316816 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.335933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.390357 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:55 crc kubenswrapper[5094]: W0220 08:39:55.126951 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223d1ebb_1681_477e_b91a_43dc2ce65d74.slice/crio-ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119 WatchSource:0}: Error finding container ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119: Status 404 returned error can't find the container with id ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119 Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.127673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.685468 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b4afe958-0e78-49e9-b05a-08ff4c42f602","Type":"ContainerStarted","Data":"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181"} Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.689024 5094 generic.go:334] "Generic (PLEG): container finished" podID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" exitCode=0 Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.689091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6"} Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.689119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerStarted","Data":"ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119"} Feb 20 08:39:55 
crc kubenswrapper[5094]: I0220 08:39:55.711449 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.568347089 podStartE2EDuration="13.71135276s" podCreationTimestamp="2026-02-20 08:39:42 +0000 UTC" firstStartedPulling="2026-02-20 08:39:43.600171236 +0000 UTC m=+6798.472797947" lastFinishedPulling="2026-02-20 08:39:54.743176907 +0000 UTC m=+6809.615803618" observedRunningTime="2026-02-20 08:39:55.710285384 +0000 UTC m=+6810.582912105" watchObservedRunningTime="2026-02-20 08:39:55.71135276 +0000 UTC m=+6810.583979471" Feb 20 08:39:56 crc kubenswrapper[5094]: I0220 08:39:56.700070 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerStarted","Data":"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186"} Feb 20 08:39:57 crc kubenswrapper[5094]: I0220 08:39:57.725942 5094 generic.go:334] "Generic (PLEG): container finished" podID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" exitCode=0 Feb 20 08:39:57 crc kubenswrapper[5094]: I0220 08:39:57.725992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186"} Feb 20 08:39:58 crc kubenswrapper[5094]: I0220 08:39:58.735415 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerStarted","Data":"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245"} Feb 20 08:39:58 crc kubenswrapper[5094]: I0220 08:39:58.758237 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z7m9s" 
podStartSLOduration=6.369382878 podStartE2EDuration="8.758215692s" podCreationTimestamp="2026-02-20 08:39:50 +0000 UTC" firstStartedPulling="2026-02-20 08:39:55.691188994 +0000 UTC m=+6810.563815735" lastFinishedPulling="2026-02-20 08:39:58.080021838 +0000 UTC m=+6812.952648549" observedRunningTime="2026-02-20 08:39:58.75067862 +0000 UTC m=+6813.623305331" watchObservedRunningTime="2026-02-20 08:39:58.758215692 +0000 UTC m=+6813.630842403" Feb 20 08:40:00 crc kubenswrapper[5094]: I0220 08:40:00.391201 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:00 crc kubenswrapper[5094]: I0220 08:40:00.391288 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:01 crc kubenswrapper[5094]: I0220 08:40:01.472522 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7m9s" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" probeResult="failure" output=< Feb 20 08:40:01 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:40:01 crc kubenswrapper[5094]: > Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.106996 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.108104 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:40:04 crc 
kubenswrapper[5094]: I0220 08:40:04.108232 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.109583 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.109688 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb" gracePeriod=600 Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.797470 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb" exitCode=0 Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.797606 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb"} Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.797977 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:40:05 crc kubenswrapper[5094]: I0220 08:40:05.809004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"} Feb 20 08:40:09 crc kubenswrapper[5094]: I0220 08:40:09.451534 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" podUID="fe469d05-edeb-4d23-b06b-6bdbfc646e99" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.49:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:40:09 crc kubenswrapper[5094]: I0220 08:40:09.452816 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" podUID="fe469d05-edeb-4d23-b06b-6bdbfc646e99" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.49:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:40:10 crc kubenswrapper[5094]: I0220 08:40:10.458531 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:10 crc kubenswrapper[5094]: I0220 08:40:10.540136 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:10 crc kubenswrapper[5094]: I0220 08:40:10.732065 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:40:12 crc kubenswrapper[5094]: I0220 08:40:12.460086 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z7m9s" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" containerID="cri-o://6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" gracePeriod=2 Feb 20 08:40:12 crc kubenswrapper[5094]: I0220 08:40:12.880405 5094 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.001995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"223d1ebb-1681-477e-b91a-43dc2ce65d74\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.002044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"223d1ebb-1681-477e-b91a-43dc2ce65d74\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.002100 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"223d1ebb-1681-477e-b91a-43dc2ce65d74\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.003802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities" (OuterVolumeSpecName: "utilities") pod "223d1ebb-1681-477e-b91a-43dc2ce65d74" (UID: "223d1ebb-1681-477e-b91a-43dc2ce65d74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.005492 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.008906 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9" (OuterVolumeSpecName: "kube-api-access-hdcb9") pod "223d1ebb-1681-477e-b91a-43dc2ce65d74" (UID: "223d1ebb-1681-477e-b91a-43dc2ce65d74"). InnerVolumeSpecName "kube-api-access-hdcb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.108161 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.163925 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "223d1ebb-1681-477e-b91a-43dc2ce65d74" (UID: "223d1ebb-1681-477e-b91a-43dc2ce65d74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.209137 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475278 5094 generic.go:334] "Generic (PLEG): container finished" podID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" exitCode=0 Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475401 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245"} Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475956 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119"} Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475983 5094 scope.go:117] "RemoveContainer" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.506438 5094 scope.go:117] "RemoveContainer" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.528040 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 
08:40:13.544437 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.564940 5094 scope.go:117] "RemoveContainer" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.595352 5094 scope.go:117] "RemoveContainer" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" Feb 20 08:40:13 crc kubenswrapper[5094]: E0220 08:40:13.596320 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245\": container with ID starting with 6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245 not found: ID does not exist" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.596386 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245"} err="failed to get container status \"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245\": rpc error: code = NotFound desc = could not find container \"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245\": container with ID starting with 6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245 not found: ID does not exist" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.596424 5094 scope.go:117] "RemoveContainer" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" Feb 20 08:40:13 crc kubenswrapper[5094]: E0220 08:40:13.597066 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186\": container with ID 
starting with c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186 not found: ID does not exist" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.597111 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186"} err="failed to get container status \"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186\": rpc error: code = NotFound desc = could not find container \"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186\": container with ID starting with c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186 not found: ID does not exist" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.597139 5094 scope.go:117] "RemoveContainer" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" Feb 20 08:40:13 crc kubenswrapper[5094]: E0220 08:40:13.597415 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6\": container with ID starting with 14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6 not found: ID does not exist" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.597447 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6"} err="failed to get container status \"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6\": rpc error: code = NotFound desc = could not find container \"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6\": container with ID starting with 14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6 not found: 
ID does not exist" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.853852 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" path="/var/lib/kubelet/pods/223d1ebb-1681-477e-b91a-43dc2ce65d74/volumes" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.337323 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:44 crc kubenswrapper[5094]: E0220 08:40:44.338944 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-content" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.338979 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-content" Feb 20 08:40:44 crc kubenswrapper[5094]: E0220 08:40:44.339027 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.339039 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" Feb 20 08:40:44 crc kubenswrapper[5094]: E0220 08:40:44.339083 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-utilities" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.339097 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-utilities" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.339664 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.354977 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.355163 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.503132 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.503180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.503221 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604241 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604292 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604926 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.635986 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.689492 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.196028 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.777326 5094 generic.go:334] "Generic (PLEG): container finished" podID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" exitCode=0 Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.777381 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0"} Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.777406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerStarted","Data":"f38c2ad1ee049bab464b7e571d472f3a493e9dc3b0801372a7334e7bc03d4f4c"} Feb 20 08:40:46 crc kubenswrapper[5094]: I0220 08:40:46.794546 5094 generic.go:334] "Generic (PLEG): container finished" podID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" exitCode=0 Feb 20 08:40:46 crc kubenswrapper[5094]: I0220 08:40:46.794633 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761"} Feb 20 08:40:47 crc kubenswrapper[5094]: I0220 08:40:47.810562 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" 
event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerStarted","Data":"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c"} Feb 20 08:40:47 crc kubenswrapper[5094]: I0220 08:40:47.835567 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sq9f8" podStartSLOduration=2.4335994579999998 podStartE2EDuration="3.83554649s" podCreationTimestamp="2026-02-20 08:40:44 +0000 UTC" firstStartedPulling="2026-02-20 08:40:45.779647819 +0000 UTC m=+6860.652274570" lastFinishedPulling="2026-02-20 08:40:47.181594891 +0000 UTC m=+6862.054221602" observedRunningTime="2026-02-20 08:40:47.829969396 +0000 UTC m=+6862.702596107" watchObservedRunningTime="2026-02-20 08:40:47.83554649 +0000 UTC m=+6862.708173201" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.691406 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.692101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.735823 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.932554 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.992721 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:56 crc kubenswrapper[5094]: I0220 08:40:56.891075 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sq9f8" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" 
containerID="cri-o://9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" gracePeriod=2 Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.302449 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.426891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.426948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.427019 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.427951 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities" (OuterVolumeSpecName: "utilities") pod "1220b209-cf9a-473e-8c43-e3fbd4ead7ee" (UID: "1220b209-cf9a-473e-8c43-e3fbd4ead7ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.432554 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6" (OuterVolumeSpecName: "kube-api-access-kgps6") pod "1220b209-cf9a-473e-8c43-e3fbd4ead7ee" (UID: "1220b209-cf9a-473e-8c43-e3fbd4ead7ee"). InnerVolumeSpecName "kube-api-access-kgps6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.518341 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1220b209-cf9a-473e-8c43-e3fbd4ead7ee" (UID: "1220b209-cf9a-473e-8c43-e3fbd4ead7ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.529044 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.529077 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.529089 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.899822 5094 generic.go:334] "Generic (PLEG): container finished" podID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" 
containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" exitCode=0 Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.899869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c"} Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.900173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"f38c2ad1ee049bab464b7e571d472f3a493e9dc3b0801372a7334e7bc03d4f4c"} Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.900019 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.901112 5094 scope.go:117] "RemoveContainer" containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.943867 5094 scope.go:117] "RemoveContainer" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.943989 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.952987 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.975653 5094 scope.go:117] "RemoveContainer" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.016795 5094 scope.go:117] "RemoveContainer" containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" Feb 20 
08:40:58 crc kubenswrapper[5094]: E0220 08:40:58.017110 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c\": container with ID starting with 9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c not found: ID does not exist" containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017148 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c"} err="failed to get container status \"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c\": rpc error: code = NotFound desc = could not find container \"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c\": container with ID starting with 9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c not found: ID does not exist" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017167 5094 scope.go:117] "RemoveContainer" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" Feb 20 08:40:58 crc kubenswrapper[5094]: E0220 08:40:58.017681 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761\": container with ID starting with 48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761 not found: ID does not exist" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017767 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761"} err="failed to get container status 
\"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761\": rpc error: code = NotFound desc = could not find container \"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761\": container with ID starting with 48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761 not found: ID does not exist" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017789 5094 scope.go:117] "RemoveContainer" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" Feb 20 08:40:58 crc kubenswrapper[5094]: E0220 08:40:58.018000 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0\": container with ID starting with e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0 not found: ID does not exist" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.018023 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0"} err="failed to get container status \"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0\": rpc error: code = NotFound desc = could not find container \"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0\": container with ID starting with e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0 not found: ID does not exist" Feb 20 08:40:59 crc kubenswrapper[5094]: I0220 08:40:59.855885 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" path="/var/lib/kubelet/pods/1220b209-cf9a-473e-8c43-e3fbd4ead7ee/volumes" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.494288 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:41:21 crc 
kubenswrapper[5094]: E0220 08:41:21.496405 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.496498 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" Feb 20 08:41:21 crc kubenswrapper[5094]: E0220 08:41:21.496598 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-content" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.496685 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-content" Feb 20 08:41:21 crc kubenswrapper[5094]: E0220 08:41:21.496786 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-utilities" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.496852 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-utilities" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.497107 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.497918 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.505072 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.506929 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.509073 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.512924 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.520780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592309 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592632 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592771 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592910 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.694814 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.694990 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.695101 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.695213 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.695765 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.696777 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.714448 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.718293 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.822755 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.840723 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:22 crc kubenswrapper[5094]: I0220 08:41:22.320931 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:41:22 crc kubenswrapper[5094]: W0220 08:41:22.330391 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb71d5b0_a19d_4900_be92_77b1abeaf856.slice/crio-7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a WatchSource:0}: Error finding container 7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a: Status 404 returned error can't find the container with id 7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a Feb 20 08:41:22 crc kubenswrapper[5094]: I0220 08:41:22.371032 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.119381 5094 generic.go:334] "Generic (PLEG): container finished" podID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerID="4f87cc562d40739a0734989e8f19246c6cf1e1144b307f5249bd8e950afcfbb0" exitCode=0 Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.119470 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a46-account-create-update-6bs8s" event={"ID":"eb71d5b0-a19d-4900-be92-77b1abeaf856","Type":"ContainerDied","Data":"4f87cc562d40739a0734989e8f19246c6cf1e1144b307f5249bd8e950afcfbb0"} Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.119505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a46-account-create-update-6bs8s" event={"ID":"eb71d5b0-a19d-4900-be92-77b1abeaf856","Type":"ContainerStarted","Data":"7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a"} Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.121471 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" containerID="4231927e6f52319c4c7cbbaa5766e18430942afbbae151ea27a85c1b2eed2b12" exitCode=0 Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.121515 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tqmv" event={"ID":"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2","Type":"ContainerDied","Data":"4231927e6f52319c4c7cbbaa5766e18430942afbbae151ea27a85c1b2eed2b12"} Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.121537 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tqmv" event={"ID":"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2","Type":"ContainerStarted","Data":"b22b1fea60091c52338c1c42e8f642f2a793eb4270ff5da8d70b8a55ab170ead"} Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.583040 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.591579 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.647551 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"eb71d5b0-a19d-4900-be92-77b1abeaf856\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.647942 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648030 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"eb71d5b0-a19d-4900-be92-77b1abeaf856\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648569 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" (UID: "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648797 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb71d5b0-a19d-4900-be92-77b1abeaf856" (UID: "eb71d5b0-a19d-4900-be92-77b1abeaf856"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.654069 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk" (OuterVolumeSpecName: "kube-api-access-2dvzk") pod "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" (UID: "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2"). InnerVolumeSpecName "kube-api-access-2dvzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.654148 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2" (OuterVolumeSpecName: "kube-api-access-bpgk2") pod "eb71d5b0-a19d-4900-be92-77b1abeaf856" (UID: "eb71d5b0-a19d-4900-be92-77b1abeaf856"). InnerVolumeSpecName "kube-api-access-bpgk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749430 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749460 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749469 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749479 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.144035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tqmv" event={"ID":"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2","Type":"ContainerDied","Data":"b22b1fea60091c52338c1c42e8f642f2a793eb4270ff5da8d70b8a55ab170ead"} Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.144098 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b22b1fea60091c52338c1c42e8f642f2a793eb4270ff5da8d70b8a55ab170ead" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.144140 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.145425 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a46-account-create-update-6bs8s" event={"ID":"eb71d5b0-a19d-4900-be92-77b1abeaf856","Type":"ContainerDied","Data":"7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a"} Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.145470 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.145485 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.798218 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:41:26 crc kubenswrapper[5094]: E0220 08:41:26.798945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" containerName="mariadb-database-create" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.798961 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" containerName="mariadb-database-create" Feb 20 08:41:26 crc kubenswrapper[5094]: E0220 08:41:26.798988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerName="mariadb-account-create-update" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.798998 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerName="mariadb-account-create-update" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.799200 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" 
containerName="mariadb-database-create" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.799231 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerName="mariadb-account-create-update" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.799953 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.802027 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6xjzv" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.802336 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.829405 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.886739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.887066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.887203 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhmf\" (UniqueName: 
\"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.988870 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.989113 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.989210 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.009830 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.011660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod 
\"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.012333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.121339 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.600226 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:41:28 crc kubenswrapper[5094]: I0220 08:41:28.176150 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerStarted","Data":"8c39505d2d712b06c3ef0abbde14162515e52096b4c8288ab6a391e5274a44d7"} Feb 20 08:41:32 crc kubenswrapper[5094]: I0220 08:41:32.223809 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerStarted","Data":"ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4"} Feb 20 08:41:32 crc kubenswrapper[5094]: I0220 08:41:32.257466 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9jfqj" podStartSLOduration=2.299088501 podStartE2EDuration="6.257437292s" podCreationTimestamp="2026-02-20 08:41:26 +0000 UTC" firstStartedPulling="2026-02-20 08:41:27.607967628 +0000 UTC m=+6902.480594339" lastFinishedPulling="2026-02-20 08:41:31.566316409 +0000 UTC m=+6906.438943130" observedRunningTime="2026-02-20 08:41:32.251147691 +0000 UTC m=+6907.123774442" 
watchObservedRunningTime="2026-02-20 08:41:32.257437292 +0000 UTC m=+6907.130064053" Feb 20 08:41:34 crc kubenswrapper[5094]: I0220 08:41:34.254202 5094 generic.go:334] "Generic (PLEG): container finished" podID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerID="ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4" exitCode=0 Feb 20 08:41:34 crc kubenswrapper[5094]: I0220 08:41:34.254422 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerDied","Data":"ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4"} Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.590093 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.657272 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod \"7b241ede-085a-44b3-857b-f64e36b7b14f\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.657375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"7b241ede-085a-44b3-857b-f64e36b7b14f\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.657412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"7b241ede-085a-44b3-857b-f64e36b7b14f\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.662625 5094 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7b241ede-085a-44b3-857b-f64e36b7b14f" (UID: "7b241ede-085a-44b3-857b-f64e36b7b14f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.663037 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf" (OuterVolumeSpecName: "kube-api-access-9hhmf") pod "7b241ede-085a-44b3-857b-f64e36b7b14f" (UID: "7b241ede-085a-44b3-857b-f64e36b7b14f"). InnerVolumeSpecName "kube-api-access-9hhmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.683981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b241ede-085a-44b3-857b-f64e36b7b14f" (UID: "7b241ede-085a-44b3-857b-f64e36b7b14f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.759488 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.759527 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.759538 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.276985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerDied","Data":"8c39505d2d712b06c3ef0abbde14162515e52096b4c8288ab6a391e5274a44d7"} Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.277049 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c39505d2d712b06c3ef0abbde14162515e52096b4c8288ab6a391e5274a44d7" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.277015 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.516839 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-84b95bd745-mrk5m"] Feb 20 08:41:36 crc kubenswrapper[5094]: E0220 08:41:36.517469 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerName="barbican-db-sync" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.517489 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerName="barbican-db-sync" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.517689 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerName="barbican-db-sync" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.518859 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.521489 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.521911 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.521919 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6xjzv" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.552041 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84b95bd745-mrk5m"] Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.567436 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f6984ff88-5xqtx"] Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.581923 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.583898 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-combined-ca-bundle\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.583964 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data-custom\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.584012 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.584048 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13560fbf-48aa-45ac-8c10-067377d1adfa-logs\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.584145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgw9\" (UniqueName: 
\"kubernetes.io/projected/13560fbf-48aa-45ac-8c10-067377d1adfa-kube-api-access-tpgw9\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.586956 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.600642 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f6984ff88-5xqtx"] Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.619877 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.624100 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.637367 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685733 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685786 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxtws\" (UniqueName: \"kubernetes.io/projected/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-kube-api-access-dxtws\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 
08:41:36.685822 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data-custom\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685842 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685883 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-combined-ca-bundle\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data-custom\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686064 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-logs\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " 
pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686104 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686134 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686204 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13560fbf-48aa-45ac-8c10-067377d1adfa-logs\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686280 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: 
\"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686390 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686440 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgw9\" (UniqueName: \"kubernetes.io/projected/13560fbf-48aa-45ac-8c10-067377d1adfa-kube-api-access-tpgw9\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.687027 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13560fbf-48aa-45ac-8c10-067377d1adfa-logs\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.691341 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data-custom\") pod 
\"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.692101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.696210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-combined-ca-bundle\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.712437 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgw9\" (UniqueName: \"kubernetes.io/projected/13560fbf-48aa-45ac-8c10-067377d1adfa-kube-api-access-tpgw9\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.737751 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69fdd7dd98-bm4fc"] Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.739098 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.743671 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.759946 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69fdd7dd98-bm4fc"] Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788840 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788892 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788933 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788955 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " 
pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788984 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789004 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9wn\" (UniqueName: \"kubernetes.io/projected/3e777e53-5dbe-4779-bc99-90bbf12cea8f-kube-api-access-vp9wn\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789024 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxtws\" (UniqueName: \"kubernetes.io/projected/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-kube-api-access-dxtws\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789052 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data-custom\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789072 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: 
\"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789109 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-logs\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789134 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-combined-ca-bundle\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789157 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data-custom\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789204 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789230 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e777e53-5dbe-4779-bc99-90bbf12cea8f-logs\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.790536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-logs\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.790721 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.791667 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " 
pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.791738 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.793518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data-custom\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.795280 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.801130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.813663 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.832187 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxtws\" (UniqueName: \"kubernetes.io/projected/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-kube-api-access-dxtws\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.852183 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data-custom\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e777e53-5dbe-4779-bc99-90bbf12cea8f-logs\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890815 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890857 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9wn\" (UniqueName: \"kubernetes.io/projected/3e777e53-5dbe-4779-bc99-90bbf12cea8f-kube-api-access-vp9wn\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-combined-ca-bundle\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.893163 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e777e53-5dbe-4779-bc99-90bbf12cea8f-logs\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.894831 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-combined-ca-bundle\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.899336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data-custom\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.900059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.907880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.911574 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9wn\" (UniqueName: \"kubernetes.io/projected/3e777e53-5dbe-4779-bc99-90bbf12cea8f-kube-api-access-vp9wn\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.953347 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.084606 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.372644 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84b95bd745-mrk5m"]
Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.379191 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"]
Feb 20 08:41:37 crc kubenswrapper[5094]: W0220 08:41:37.380595 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdb5c820_23c9_42e7_9c70_d8f504f47ff5.slice/crio-30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0 WatchSource:0}: Error finding container 30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0: Status 404 returned error can't find the container with id 30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0
Feb 20 08:41:37 crc kubenswrapper[5094]: W0220 08:41:37.381084 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13560fbf_48aa_45ac_8c10_067377d1adfa.slice/crio-49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50 WatchSource:0}: Error finding container 49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50: Status 404 returned error can't find the container with id 49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50
Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.474410 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f6984ff88-5xqtx"]
Feb 20 08:41:37 crc kubenswrapper[5094]: W0220 08:41:37.476883 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb128e8c6_6bcb_4e4b_b648_d3f932ad0a0a.slice/crio-fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d WatchSource:0}: Error finding container fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d: Status 404 returned error can't find the container with id fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d
Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.652243 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69fdd7dd98-bm4fc"]
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.302533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fdd7dd98-bm4fc" event={"ID":"3e777e53-5dbe-4779-bc99-90bbf12cea8f","Type":"ContainerStarted","Data":"2d550ec529357c21f1fc2dd66c42f495d790148c60f79e531c19862c0c74ae22"}
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.303144 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fdd7dd98-bm4fc" event={"ID":"3e777e53-5dbe-4779-bc99-90bbf12cea8f","Type":"ContainerStarted","Data":"2e7e50414a7da88d85240944a44c96b6efa122453231347a8284e7b90a4f99c5"}
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.303155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fdd7dd98-bm4fc" event={"ID":"3e777e53-5dbe-4779-bc99-90bbf12cea8f","Type":"ContainerStarted","Data":"6505a9d3fc5a6422418e299d1ee2d5edda6daae2fce25af07a0cd0dd6262e891"}
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.304220 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.304244 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.306679 5094 generic.go:334] "Generic (PLEG): container finished" podID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" exitCode=0
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.306801 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerDied","Data":"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235"}
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.306826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerStarted","Data":"30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0"}
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.309563 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" event={"ID":"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a","Type":"ContainerStarted","Data":"fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d"}
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.324655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b95bd745-mrk5m" event={"ID":"13560fbf-48aa-45ac-8c10-067377d1adfa","Type":"ContainerStarted","Data":"49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50"}
Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.333517 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69fdd7dd98-bm4fc" podStartSLOduration=2.333494462 podStartE2EDuration="2.333494462s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:41:38.330400098 +0000 UTC m=+6913.203026819" watchObservedRunningTime="2026-02-20 08:41:38.333494462 +0000 UTC m=+6913.206121183"
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.334317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerStarted","Data":"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85"}
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.334858 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.336982 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" event={"ID":"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a","Type":"ContainerStarted","Data":"3d15e5fff529a2b063ebbf609f9d126ad00fc95612bcd84704e79dfd39910a97"}
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.337042 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" event={"ID":"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a","Type":"ContainerStarted","Data":"2baab9f100ae06234de01222fcf22f1bd7f6a5d3cde7b0291340ff39c51c1896"}
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.339041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b95bd745-mrk5m" event={"ID":"13560fbf-48aa-45ac-8c10-067377d1adfa","Type":"ContainerStarted","Data":"436e0386deb8a8833e0385750a15de7fac619644cccc1c8b1de2d7d3f00c3e3c"}
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.339132 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b95bd745-mrk5m" event={"ID":"13560fbf-48aa-45ac-8c10-067377d1adfa","Type":"ContainerStarted","Data":"247355ce6bfad6ca0c0d9842bdce1e5f689f717fadc309b92b2429a59f3ffa26"}
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.363394 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" podStartSLOduration=3.363374669 podStartE2EDuration="3.363374669s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:41:39.354997687 +0000 UTC m=+6914.227624438" watchObservedRunningTime="2026-02-20 08:41:39.363374669 +0000 UTC m=+6914.236001380"
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.378529 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-84b95bd745-mrk5m" podStartSLOduration=2.098396563 podStartE2EDuration="3.378510693s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="2026-02-20 08:41:37.383927368 +0000 UTC m=+6912.256554079" lastFinishedPulling="2026-02-20 08:41:38.664041498 +0000 UTC m=+6913.536668209" observedRunningTime="2026-02-20 08:41:39.376212118 +0000 UTC m=+6914.248838859" watchObservedRunningTime="2026-02-20 08:41:39.378510693 +0000 UTC m=+6914.251137404"
Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.399586 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" podStartSLOduration=2.215128963 podStartE2EDuration="3.39956539s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="2026-02-20 08:41:37.480461801 +0000 UTC m=+6912.353088512" lastFinishedPulling="2026-02-20 08:41:38.664898228 +0000 UTC m=+6913.537524939" observedRunningTime="2026-02-20 08:41:39.39332312 +0000 UTC m=+6914.265949871" watchObservedRunningTime="2026-02-20 08:41:39.39956539 +0000 UTC m=+6914.272192101"
Feb 20 08:41:43 crc kubenswrapper[5094]: I0220 08:41:43.551770 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:44 crc kubenswrapper[5094]: I0220 08:41:44.889952 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:46 crc kubenswrapper[5094]: I0220 08:41:46.955869 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.012646 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"]
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.012935 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns" containerID="cri-o://24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a" gracePeriod=10
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.417126 5094 generic.go:334] "Generic (PLEG): container finished" podID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerID="24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a" exitCode=0
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.417643 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerDied","Data":"24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a"}
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.549353 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590792 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") "
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590906 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") "
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590961 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") "
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590986 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") "
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.591043 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") "
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.625678 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7" (OuterVolumeSpecName: "kube-api-access-xcqs7") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "kube-api-access-xcqs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.658762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.658819 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.659329 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.659346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config" (OuterVolumeSpecName: "config") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692884 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692915 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692925 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692934 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692944 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.426843 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerDied","Data":"b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef"}
Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.427252 5094 scope.go:117] "RemoveContainer" containerID="24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a"
Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.426897 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj"
Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.455291 5094 scope.go:117] "RemoveContainer" containerID="d7d23a98ee3bcf78f157fef71692c3b42c0ccd4e53f68bb8466090a9c903b801"
Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.456807 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"]
Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.464528 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"]
Feb 20 08:41:49 crc kubenswrapper[5094]: I0220 08:41:49.852430 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" path="/var/lib/kubelet/pods/e3b60dac-2fbe-46ba-acc9-92058e10f2d1/volumes"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.829102 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rbwv2"]
Feb 20 08:41:51 crc kubenswrapper[5094]: E0220 08:41:51.829845 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="init"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.829859 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="init"
Feb 20 08:41:51 crc kubenswrapper[5094]: E0220 08:41:51.829884 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.829891 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.830041 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.830603 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.865532 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rbwv2"]
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.934229 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"]
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.935268 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.937728 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.969516 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.969611 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.977830 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"]
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.070756 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.070841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.071183 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.071473 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.071675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.097183 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.151458 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.174234 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.174340 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.175288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.195059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.279518 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.705485 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rbwv2"]
Feb 20 08:41:52 crc kubenswrapper[5094]: W0220 08:41:52.710515 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f3a4acd_5b68_467c_b024_b518d0f4d27e.slice/crio-e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b WatchSource:0}: Error finding container e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b: Status 404 returned error can't find the container with id e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b
Feb 20 08:41:52 crc kubenswrapper[5094]: W0220 08:41:52.846368 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8c2373d_6a69_460a_8622_d001dc53efc0.slice/crio-6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff WatchSource:0}: Error finding container 6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff: Status 404 returned error can't find the container with id 6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff
Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.857238 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"]
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.470346 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerID="4495a0b785b56a81800453fd2516a41bac0676f202c2358f07c81e7849110742" exitCode=0
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.470435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc68-account-create-update-ctxrq" event={"ID":"d8c2373d-6a69-460a-8622-d001dc53efc0","Type":"ContainerDied","Data":"4495a0b785b56a81800453fd2516a41bac0676f202c2358f07c81e7849110742"}
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.470486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc68-account-create-update-ctxrq" event={"ID":"d8c2373d-6a69-460a-8622-d001dc53efc0","Type":"ContainerStarted","Data":"6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff"}
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.472900 5094 generic.go:334] "Generic (PLEG): container finished" podID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" containerID="5af35aa0d974ec2be3d578b66402a33233be4efbd611deaf5976f2b6d54c4e72" exitCode=0
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.472941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbwv2" event={"ID":"0f3a4acd-5b68-467c-b024-b518d0f4d27e","Type":"ContainerDied","Data":"5af35aa0d974ec2be3d578b66402a33233be4efbd611deaf5976f2b6d54c4e72"}
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.472962 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbwv2" event={"ID":"0f3a4acd-5b68-467c-b024-b518d0f4d27e","Type":"ContainerStarted","Data":"e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b"}
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.014622 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.023542 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.138652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"d8c2373d-6a69-460a-8622-d001dc53efc0\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.138835 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.138898 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.139053 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"d8c2373d-6a69-460a-8622-d001dc53efc0\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.139607 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8c2373d-6a69-460a-8622-d001dc53efc0" (UID: "d8c2373d-6a69-460a-8622-d001dc53efc0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.139673 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f3a4acd-5b68-467c-b024-b518d0f4d27e" (UID: "0f3a4acd-5b68-467c-b024-b518d0f4d27e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.144126 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg" (OuterVolumeSpecName: "kube-api-access-n24tg") pod "0f3a4acd-5b68-467c-b024-b518d0f4d27e" (UID: "0f3a4acd-5b68-467c-b024-b518d0f4d27e"). InnerVolumeSpecName "kube-api-access-n24tg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.146521 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns" (OuterVolumeSpecName: "kube-api-access-57dns") pod "d8c2373d-6a69-460a-8622-d001dc53efc0" (UID: "d8c2373d-6a69-460a-8622-d001dc53efc0"). InnerVolumeSpecName "kube-api-access-57dns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242093 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242151 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242161 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242171 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.489462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbwv2" event={"ID":"0f3a4acd-5b68-467c-b024-b518d0f4d27e","Type":"ContainerDied","Data":"e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b"}
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.489882 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.489487 5094 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.491557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc68-account-create-update-ctxrq" event={"ID":"d8c2373d-6a69-460a-8622-d001dc53efc0","Type":"ContainerDied","Data":"6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff"} Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.491605 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff" Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.491626 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.167969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-j7lxk"] Feb 20 08:41:57 crc kubenswrapper[5094]: E0220 08:41:57.168350 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" containerName="mariadb-database-create" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168364 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" containerName="mariadb-database-create" Feb 20 08:41:57 crc kubenswrapper[5094]: E0220 08:41:57.168400 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerName="mariadb-account-create-update" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168409 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerName="mariadb-account-create-update" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168587 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" 
containerName="mariadb-database-create" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168600 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerName="mariadb-account-create-update" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.169261 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.171341 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.171443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v6cxj" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.171519 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.188396 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-j7lxk"] Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.281000 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.281085 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.281130 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.383319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.383372 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.383400 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.394479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.395369 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.407883 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.492880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.967760 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-j7lxk"] Feb 20 08:41:58 crc kubenswrapper[5094]: I0220 08:41:58.516775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerStarted","Data":"710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660"} Feb 20 08:41:58 crc kubenswrapper[5094]: I0220 08:41:58.517107 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerStarted","Data":"06453b626b934944ded65ddde039629a8276c6c1665fc2259f6c73b4d5e297ff"} Feb 20 08:41:58 crc kubenswrapper[5094]: I0220 08:41:58.542898 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-j7lxk" podStartSLOduration=1.542879184 podStartE2EDuration="1.542879184s" podCreationTimestamp="2026-02-20 08:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:41:58.534144353 +0000 UTC m=+6933.406771064" 
watchObservedRunningTime="2026-02-20 08:41:58.542879184 +0000 UTC m=+6933.415505895" Feb 20 08:42:03 crc kubenswrapper[5094]: I0220 08:42:03.569734 5094 generic.go:334] "Generic (PLEG): container finished" podID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerID="710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660" exitCode=0 Feb 20 08:42:03 crc kubenswrapper[5094]: I0220 08:42:03.569898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerDied","Data":"710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660"} Feb 20 08:42:04 crc kubenswrapper[5094]: I0220 08:42:04.971766 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.124973 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"bcbd09e1-8a1b-468e-9238-0691cafda43e\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.125071 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"bcbd09e1-8a1b-468e-9238-0691cafda43e\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.125218 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"bcbd09e1-8a1b-468e-9238-0691cafda43e\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.130339 5094 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b" (OuterVolumeSpecName: "kube-api-access-xhp7b") pod "bcbd09e1-8a1b-468e-9238-0691cafda43e" (UID: "bcbd09e1-8a1b-468e-9238-0691cafda43e"). InnerVolumeSpecName "kube-api-access-xhp7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.148660 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config" (OuterVolumeSpecName: "config") pod "bcbd09e1-8a1b-468e-9238-0691cafda43e" (UID: "bcbd09e1-8a1b-468e-9238-0691cafda43e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.149816 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcbd09e1-8a1b-468e-9238-0691cafda43e" (UID: "bcbd09e1-8a1b-468e-9238-0691cafda43e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.227531 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.227571 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.227585 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.590613 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.590605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerDied","Data":"06453b626b934944ded65ddde039629a8276c6c1665fc2259f6c73b4d5e297ff"} Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.591106 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06453b626b934944ded65ddde039629a8276c6c1665fc2259f6c73b4d5e297ff" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.743732 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:42:05 crc kubenswrapper[5094]: E0220 08:42:05.744250 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerName="neutron-db-sync" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.744276 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerName="neutron-db-sync" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.744560 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerName="neutron-db-sync" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.746094 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.762082 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839061 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839142 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839321 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839476 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839595 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.882874 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-585ff4fdf7-llqts"] Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.884530 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.889474 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.890130 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v6cxj" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.890284 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.900263 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-585ff4fdf7-llqts"] Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.942144 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod 
\"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.942250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.942327 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.945387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.945684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.946412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " 
pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.946870 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.947319 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.948526 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.969982 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.046963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9kv\" (UniqueName: \"kubernetes.io/projected/78fff8ae-90d4-490d-b302-45fce0bd0101-kube-api-access-zr9kv\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 
crc kubenswrapper[5094]: I0220 08:42:06.047045 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-httpd-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.047258 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.047355 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-combined-ca-bundle\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.129151 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9kv\" (UniqueName: \"kubernetes.io/projected/78fff8ae-90d4-490d-b302-45fce0bd0101-kube-api-access-zr9kv\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149514 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-httpd-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149579 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-combined-ca-bundle\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.154012 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-combined-ca-bundle\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc 
kubenswrapper[5094]: I0220 08:42:06.154254 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-httpd-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.154631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.179648 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9kv\" (UniqueName: \"kubernetes.io/projected/78fff8ae-90d4-490d-b302-45fce0bd0101-kube-api-access-zr9kv\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.216483 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.596806 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:42:06 crc kubenswrapper[5094]: W0220 08:42:06.599904 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b1e378_3ae1_4707_8c14_3ee7ad292a55.slice/crio-4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61 WatchSource:0}: Error finding container 4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61: Status 404 returned error can't find the container with id 4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61 Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.877462 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-585ff4fdf7-llqts"] Feb 20 08:42:06 crc kubenswrapper[5094]: W0220 08:42:06.879372 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78fff8ae_90d4_490d_b302_45fce0bd0101.slice/crio-9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0 WatchSource:0}: Error finding container 9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0: Status 404 returned error can't find the container with id 9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0 Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.606567 5094 generic.go:334] "Generic (PLEG): container finished" podID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerID="a8b2fd7d2d6131025e57a52935b313eca72cf89c2a15b9432a7f1f536f05a672" exitCode=0 Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.606917 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" 
event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerDied","Data":"a8b2fd7d2d6131025e57a52935b313eca72cf89c2a15b9432a7f1f536f05a672"} Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.606943 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerStarted","Data":"4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61"} Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609694 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-585ff4fdf7-llqts" event={"ID":"78fff8ae-90d4-490d-b302-45fce0bd0101","Type":"ContainerStarted","Data":"a17ae4487e7812976ea9b81c300e3a145fb7794bb0b360255b78bde10a86c7e5"} Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-585ff4fdf7-llqts" event={"ID":"78fff8ae-90d4-490d-b302-45fce0bd0101","Type":"ContainerStarted","Data":"184344a649be3445ec7813f76772cf06714a504a32568a8084582827a77a9e06"} Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-585ff4fdf7-llqts" event={"ID":"78fff8ae-90d4-490d-b302-45fce0bd0101","Type":"ContainerStarted","Data":"9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0"} Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609931 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:08 crc kubenswrapper[5094]: I0220 08:42:08.620554 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerStarted","Data":"09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09"} Feb 20 08:42:08 crc kubenswrapper[5094]: I0220 08:42:08.642879 5094 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/neutron-585ff4fdf7-llqts" podStartSLOduration=3.642854681 podStartE2EDuration="3.642854681s" podCreationTimestamp="2026-02-20 08:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:07.646004738 +0000 UTC m=+6942.518631459" watchObservedRunningTime="2026-02-20 08:42:08.642854681 +0000 UTC m=+6943.515481402" Feb 20 08:42:08 crc kubenswrapper[5094]: I0220 08:42:08.644616 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-549654cbff-7zg62" podStartSLOduration=3.644605483 podStartE2EDuration="3.644605483s" podCreationTimestamp="2026-02-20 08:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:08.6386574 +0000 UTC m=+6943.511284121" watchObservedRunningTime="2026-02-20 08:42:08.644605483 +0000 UTC m=+6943.517232204" Feb 20 08:42:09 crc kubenswrapper[5094]: I0220 08:42:09.627613 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.131849 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.206802 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.207422 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="dnsmasq-dns" containerID="cri-o://8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" gracePeriod=10 Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.684447 5094 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698813 5094 generic.go:334] "Generic (PLEG): container finished" podID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" exitCode=0 Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698855 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerDied","Data":"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85"} Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698879 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698891 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerDied","Data":"30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0"} Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698910 5094 scope.go:117] "RemoveContainer" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.737653 5094 scope.go:117] "RemoveContainer" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.759141 5094 scope.go:117] "RemoveContainer" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.759967 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod 
\"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760849 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760924 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760944 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: E0220 08:42:16.761740 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85\": container with ID starting with 8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85 not found: ID does not exist" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" Feb 20 08:42:16 crc kubenswrapper[5094]: 
I0220 08:42:16.761798 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85"} err="failed to get container status \"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85\": rpc error: code = NotFound desc = could not find container \"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85\": container with ID starting with 8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85 not found: ID does not exist" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.761828 5094 scope.go:117] "RemoveContainer" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" Feb 20 08:42:16 crc kubenswrapper[5094]: E0220 08:42:16.762801 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235\": container with ID starting with faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235 not found: ID does not exist" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.762845 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235"} err="failed to get container status \"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235\": rpc error: code = NotFound desc = could not find container \"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235\": container with ID starting with faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235 not found: ID does not exist" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.767396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp" (OuterVolumeSpecName: "kube-api-access-hpcqp") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "kube-api-access-hpcqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.805934 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.805971 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.809249 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config" (OuterVolumeSpecName: "config") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.811771 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863408 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863442 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863452 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863462 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863471 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:17 crc kubenswrapper[5094]: I0220 08:42:17.027320 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:42:17 crc kubenswrapper[5094]: I0220 08:42:17.042322 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:42:17 crc kubenswrapper[5094]: I0220 08:42:17.848964 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" path="/var/lib/kubelet/pods/fdb5c820-23c9-42e7-9c70-d8f504f47ff5/volumes" Feb 20 08:42:34 crc kubenswrapper[5094]: 
I0220 08:42:34.106920 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:42:34 crc kubenswrapper[5094]: I0220 08:42:34.107908 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:42:36 crc kubenswrapper[5094]: I0220 08:42:36.228664 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.605048 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:42:43 crc kubenswrapper[5094]: E0220 08:42:43.605882 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="init" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.605902 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="init" Feb 20 08:42:43 crc kubenswrapper[5094]: E0220 08:42:43.605922 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="dnsmasq-dns" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.605931 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="dnsmasq-dns" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.606156 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" 
containerName="dnsmasq-dns" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.606855 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.623945 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.706364 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.707367 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.709500 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.722130 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.742503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.745372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846221 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846410 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846686 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.847272 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.866945 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.923871 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.948888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.949025 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.950933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.971990 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbr6b\" (UniqueName: 
\"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.023318 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.419878 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.524432 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:42:44 crc kubenswrapper[5094]: W0220 08:42:44.526825 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0fbd49a_25e7_44de_a81d_f324feba0dff.slice/crio-fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d WatchSource:0}: Error finding container fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d: Status 404 returned error can't find the container with id fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.940186 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerStarted","Data":"26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.940524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerStarted","Data":"e27064341d4ad31ef21d53047d3bf23db0f06583a1eb5340c5cc014514f5cf27"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.941802 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerStarted","Data":"0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.941845 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerStarted","Data":"fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.959148 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-bgw44" podStartSLOduration=1.9591320749999999 podStartE2EDuration="1.959132075s" podCreationTimestamp="2026-02-20 08:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:44.957149057 +0000 UTC m=+6979.829775768" watchObservedRunningTime="2026-02-20 08:42:44.959132075 +0000 UTC m=+6979.831758786" Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.979380 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-80d8-account-create-update-2lhxz" podStartSLOduration=1.979358112 podStartE2EDuration="1.979358112s" podCreationTimestamp="2026-02-20 08:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:44.972746932 +0000 UTC m=+6979.845373653" watchObservedRunningTime="2026-02-20 08:42:44.979358112 +0000 UTC m=+6979.851984823" Feb 20 08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.952454 5094 generic.go:334] "Generic (PLEG): container finished" podID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerID="26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a" exitCode=0 Feb 20 
08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.952558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerDied","Data":"26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a"} Feb 20 08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.953695 5094 generic.go:334] "Generic (PLEG): container finished" podID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerID="0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a" exitCode=0 Feb 20 08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.953751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerDied","Data":"0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a"} Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.300765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.307853 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.425949 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.426024 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"f0fbd49a-25e7-44de-a81d-f324feba0dff\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.426083 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.426228 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"f0fbd49a-25e7-44de-a81d-f324feba0dff\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.428157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0fbd49a-25e7-44de-a81d-f324feba0dff" (UID: "f0fbd49a-25e7-44de-a81d-f324feba0dff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.428269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d39890b-bbcb-4fcb-9f5e-6f74782fc661" (UID: "5d39890b-bbcb-4fcb-9f5e-6f74782fc661"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.432583 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b" (OuterVolumeSpecName: "kube-api-access-pbr6b") pod "f0fbd49a-25e7-44de-a81d-f324feba0dff" (UID: "f0fbd49a-25e7-44de-a81d-f324feba0dff"). InnerVolumeSpecName "kube-api-access-pbr6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.433215 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv" (OuterVolumeSpecName: "kube-api-access-5m5xv") pod "5d39890b-bbcb-4fcb-9f5e-6f74782fc661" (UID: "5d39890b-bbcb-4fcb-9f5e-6f74782fc661"). InnerVolumeSpecName "kube-api-access-5m5xv". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528222 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") on node \"crc\" DevicePath \"\""
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528255 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") on node \"crc\" DevicePath \"\""
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528267 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528276 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.967785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerDied","Data":"fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d"}
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.967825 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d"
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.967880 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz"
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.970545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerDied","Data":"e27064341d4ad31ef21d53047d3bf23db0f06583a1eb5340c5cc014514f5cf27"}
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.970574 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e27064341d4ad31ef21d53047d3bf23db0f06583a1eb5340c5cc014514f5cf27"
Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.970616 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.039771 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2fzgg"]
Feb 20 08:42:49 crc kubenswrapper[5094]: E0220 08:42:49.040394 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerName="mariadb-database-create"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040407 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerName="mariadb-database-create"
Feb 20 08:42:49 crc kubenswrapper[5094]: E0220 08:42:49.040438 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerName="mariadb-account-create-update"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040443 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerName="mariadb-account-create-update"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040591 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerName="mariadb-database-create"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040606 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerName="mariadb-account-create-update"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.041153 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.048100 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7xqwp"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.048324 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.065773 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2fzgg"]
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.159610 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.159684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.159840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.160023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261850 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261941 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.266807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.269255 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.279954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.301884 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.366381 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.858263 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2fzgg"]
Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.863274 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 08:42:50 crc kubenswrapper[5094]: I0220 08:42:50.026592 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerStarted","Data":"32f4328b6fac64f8733dba9143d7b674a26899af5e5493def4751935ca89de98"}
Feb 20 08:43:04 crc kubenswrapper[5094]: I0220 08:43:04.107127 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:43:04 crc kubenswrapper[5094]: I0220 08:43:04.107747 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:43:15 crc kubenswrapper[5094]: I0220 08:43:15.239166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerStarted","Data":"d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2"}
Feb 20 08:43:15 crc kubenswrapper[5094]: I0220 08:43:15.257754 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2fzgg" podStartSLOduration=1.667360481 podStartE2EDuration="26.257736725s" podCreationTimestamp="2026-02-20 08:42:49 +0000 UTC" firstStartedPulling="2026-02-20 08:42:49.863044053 +0000 UTC m=+6984.735670764" lastFinishedPulling="2026-02-20 08:43:14.453420307 +0000 UTC m=+7009.326047008" observedRunningTime="2026-02-20 08:43:15.254798554 +0000 UTC m=+7010.127425265" watchObservedRunningTime="2026-02-20 08:43:15.257736725 +0000 UTC m=+7010.130363436"
Feb 20 08:43:18 crc kubenswrapper[5094]: I0220 08:43:18.282784 5094 generic.go:334] "Generic (PLEG): container finished" podID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerID="d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2" exitCode=0
Feb 20 08:43:18 crc kubenswrapper[5094]: I0220 08:43:18.282874 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerDied","Data":"d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2"}
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.790197 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943407 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") "
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943498 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") "
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") "
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") "
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.948796 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw" (OuterVolumeSpecName: "kube-api-access-nc4dw") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "kube-api-access-nc4dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.955366 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.966665 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.988270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data" (OuterVolumeSpecName: "config-data") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046206 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046247 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046260 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") on node \"crc\" DevicePath \"\""
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046273 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.305234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerDied","Data":"32f4328b6fac64f8733dba9143d7b674a26899af5e5493def4751935ca89de98"}
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.305272 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32f4328b6fac64f8733dba9143d7b674a26899af5e5493def4751935ca89de98"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.305302 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2fzgg"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.683813 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 08:43:20 crc kubenswrapper[5094]: E0220 08:43:20.684213 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerName="glance-db-sync"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.684230 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerName="glance-db-sync"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.684392 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerName="glance-db-sync"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.685294 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688204 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7xqwp"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688630 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688796 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688890 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.702046 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"]
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.703677 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.716127 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"]
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.732590 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759589 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759816 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.844928 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.846382 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.857081 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.858467 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861154 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861251 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861285 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861305 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861324 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861341 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861402 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861450 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861865 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.862291 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.866473 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.866881 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.867477 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.871683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.903354 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.963976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964054 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964074 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964098 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964170 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964218 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964234 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.965116 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.965510 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.969313 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.970333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.981307 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.004153 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.019218 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066170 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066716 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066747 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066818 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.067230 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:43:21 crc
kubenswrapper[5094]: I0220 08:43:21.067533 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.070845 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.071452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.072411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.073120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.085429 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.161318 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: W0220 08:43:21.588394 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbabc907_d404_4942_a4d8_470ca14f2727.slice/crio-cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4 WatchSource:0}: Error finding container cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4: Status 404 returned error can't find the container with id cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4 Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.589059 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.606929 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:43:21 crc kubenswrapper[5094]: W0220 08:43:21.621529 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d7a949_86f8_4325_8494_1e37848e76ec.slice/crio-9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a WatchSource:0}: Error finding container 9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a: Status 404 returned error can't find the container with id 9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.664517 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:21 crc kubenswrapper[5094]: W0220 08:43:21.797922 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6187d03_61b1_472a_815f_ca42b191010f.slice/crio-49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b WatchSource:0}: Error finding container 49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b: Status 404 returned error can't find the container with id 49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.798041 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.336793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerStarted","Data":"49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.340340 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerStarted","Data":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.340388 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerStarted","Data":"cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.343783 5094 generic.go:334] "Generic (PLEG): container finished" podID="84d7a949-86f8-4325-8494-1e37848e76ec" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" exitCode=0 Feb 20 08:43:22 crc kubenswrapper[5094]: 
I0220 08:43:22.343848 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerDied","Data":"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.343873 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerStarted","Data":"9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.352903 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerStarted","Data":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.353727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerStarted","Data":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.354311 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerStarted","Data":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.354368 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" containerID="cri-o://85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" gracePeriod=30 Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.354425 5094 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" containerID="cri-o://d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" gracePeriod=30 Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.356141 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerStarted","Data":"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.356294 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.379324 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.379306246 podStartE2EDuration="3.379306246s" podCreationTimestamp="2026-02-20 08:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:23.374006138 +0000 UTC m=+7018.246632849" watchObservedRunningTime="2026-02-20 08:43:23.379306246 +0000 UTC m=+7018.251932947" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.403136 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.403118499 podStartE2EDuration="3.403118499s" podCreationTimestamp="2026-02-20 08:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:23.396189002 +0000 UTC m=+7018.268815713" watchObservedRunningTime="2026-02-20 08:43:23.403118499 +0000 UTC m=+7018.275745210" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.842643 5094 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" podStartSLOduration=3.842624417 podStartE2EDuration="3.842624417s" podCreationTimestamp="2026-02-20 08:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:23.419510714 +0000 UTC m=+7018.292137425" watchObservedRunningTime="2026-02-20 08:43:23.842624417 +0000 UTC m=+7018.715251128" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.856801 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.995870 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133756 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133814 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133831 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133858 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133875 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.134111 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.134325 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.134640 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs" (OuterVolumeSpecName: "logs") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.139076 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts" (OuterVolumeSpecName: "scripts") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.139714 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph" (OuterVolumeSpecName: "ceph") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.144060 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg" (OuterVolumeSpecName: "kube-api-access-g6cfg") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "kube-api-access-g6cfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.160791 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.192222 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data" (OuterVolumeSpecName: "config-data") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236267 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236302 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236312 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236320 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") on node \"crc\" DevicePath \"\"" Feb 20 
08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236331 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236339 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364722 5094 generic.go:334] "Generic (PLEG): container finished" podID="fbabc907-d404-4942-a4d8-470ca14f2727" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" exitCode=0 Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364753 5094 generic.go:334] "Generic (PLEG): container finished" podID="fbabc907-d404-4942-a4d8-470ca14f2727" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" exitCode=143 Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364805 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364863 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerDied","Data":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerDied","Data":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364915 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerDied","Data":"cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4"} Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364934 5094 scope.go:117] "RemoveContainer" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.387227 5094 scope.go:117] "RemoveContainer" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.398252 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.406484 5094 scope.go:117] "RemoveContainer" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.406504 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.407197 5094 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": container with ID starting with d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851 not found: ID does not exist" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407230 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} err="failed to get container status \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": rpc error: code = NotFound desc = could not find container \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": container with ID starting with d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407251 5094 scope.go:117] "RemoveContainer" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.407503 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": container with ID starting with 85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0 not found: ID does not exist" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407526 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} err="failed to get container status \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": rpc error: code = NotFound desc = could not find container 
\"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": container with ID starting with 85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407543 5094 scope.go:117] "RemoveContainer" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407790 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} err="failed to get container status \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": rpc error: code = NotFound desc = could not find container \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": container with ID starting with d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407817 5094 scope.go:117] "RemoveContainer" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.408120 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} err="failed to get container status \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": rpc error: code = NotFound desc = could not find container \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": container with ID starting with 85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.426237 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.426736 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.426761 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.426779 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.426790 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.427022 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.427059 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.428105 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.430112 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.440912 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.543675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.543923 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544038 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 
08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544381 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544413 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646160 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 
08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646342 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646585 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646854 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.651095 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.651752 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.652450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.656074 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.663663 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.797551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.349465 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:25 crc kubenswrapper[5094]: W0220 08:43:25.362158 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0babde66_7106_44f9_8108_dc7123e64645.slice/crio-8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5 WatchSource:0}: Error finding container 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5: Status 404 returned error can't find the container with id 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5 Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.375596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerStarted","Data":"8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5"} Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.375758 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" containerID="cri-o://0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" gracePeriod=30 Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.375810 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" containerID="cri-o://c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" gracePeriod=30 Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.851814 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" path="/var/lib/kubelet/pods/fbabc907-d404-4942-a4d8-470ca14f2727/volumes" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.882874 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968630 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968680 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968714 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: 
\"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968782 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968817 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968843 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.969718 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs" (OuterVolumeSpecName: "logs") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.969944 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.975634 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph" (OuterVolumeSpecName: "ceph") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.975922 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts" (OuterVolumeSpecName: "scripts") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.976931 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh" (OuterVolumeSpecName: "kube-api-access-zzklh") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "kube-api-access-zzklh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.011838 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.033433 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data" (OuterVolumeSpecName: "config-data") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070475 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070523 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070537 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070554 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 
crc kubenswrapper[5094]: I0220 08:43:26.070566 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070588 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070600 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386465 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6187d03-61b1-472a-815f-ca42b191010f" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" exitCode=0 Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386497 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6187d03-61b1-472a-815f-ca42b191010f" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" exitCode=143 Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386531 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerDied","Data":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerDied","Data":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386567 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerDied","Data":"49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386582 5094 scope.go:117] "RemoveContainer" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386695 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.393396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerStarted","Data":"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.447609 5094 scope.go:117] "RemoveContainer" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.450599 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.465881 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.475410 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: E0220 08:43:26.475833 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.475850 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" Feb 20 08:43:26 crc kubenswrapper[5094]: 
E0220 08:43:26.475874 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.475882 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.476030 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.476050 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.476974 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.480283 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.481116 5094 scope.go:117] "RemoveContainer" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: E0220 08:43:26.482471 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": container with ID starting with c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a not found: ID does not exist" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.482633 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} err="failed 
to get container status \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": rpc error: code = NotFound desc = could not find container \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": container with ID starting with c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.482823 5094 scope.go:117] "RemoveContainer" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.485114 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: E0220 08:43:26.499382 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": container with ID starting with 0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce not found: ID does not exist" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.499423 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} err="failed to get container status \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": rpc error: code = NotFound desc = could not find container \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": container with ID starting with 0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.499448 5094 scope.go:117] "RemoveContainer" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.505045 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} err="failed to get container status \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": rpc error: code = NotFound desc = could not find container \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": container with ID starting with c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.505079 5094 scope.go:117] "RemoveContainer" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.505388 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} err="failed to get container status \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": rpc error: code = NotFound desc = could not find container \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": container with ID starting with 0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.581960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582055 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582099 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582225 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582432 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582493 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684192 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685238 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685350 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685476 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685281 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685688 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.692336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.693474 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.693820 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.694177 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.702205 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.800396 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.359968 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.406675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerStarted","Data":"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"} Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.408199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerStarted","Data":"b0e75d749acef441fc02419393d76ceab32d56e244c55548218d0246a2690c4a"} Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.443223 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.443204896 podStartE2EDuration="3.443204896s" podCreationTimestamp="2026-02-20 08:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:27.442006448 +0000 UTC m=+7022.314633179" watchObservedRunningTime="2026-02-20 08:43:27.443204896 +0000 UTC m=+7022.315831607" Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.855998 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6187d03-61b1-472a-815f-ca42b191010f" path="/var/lib/kubelet/pods/a6187d03-61b1-472a-815f-ca42b191010f/volumes" Feb 20 08:43:28 crc kubenswrapper[5094]: I0220 08:43:28.419875 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerStarted","Data":"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"} Feb 20 08:43:28 crc kubenswrapper[5094]: I0220 08:43:28.420224 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerStarted","Data":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} Feb 20 08:43:28 crc kubenswrapper[5094]: I0220 08:43:28.447602 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.44758329 podStartE2EDuration="2.44758329s" podCreationTimestamp="2026-02-20 08:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:28.443114903 +0000 UTC m=+7023.315741664" watchObservedRunningTime="2026-02-20 08:43:28.44758329 +0000 UTC m=+7023.320209991" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.021732 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.087622 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.087996 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-549654cbff-7zg62" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" containerID="cri-o://09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09" gracePeriod=10 Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.131028 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-549654cbff-7zg62" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.1.38:5353: connect: connection refused" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.461083 5094 generic.go:334] "Generic (PLEG): container finished" podID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerID="09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09" exitCode=0 Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.461359 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerDied","Data":"09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09"} Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.571307 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677781 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677903 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677933 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677978 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.678012 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.691032 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc" (OuterVolumeSpecName: "kube-api-access-x9dbc") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "kube-api-access-x9dbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.722909 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.722982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config" (OuterVolumeSpecName: "config") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.725346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.726435 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779614 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779650 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779660 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779671 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc 
kubenswrapper[5094]: I0220 08:43:31.779680 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.486526 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerDied","Data":"4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61"} Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.486597 5094 scope.go:117] "RemoveContainer" containerID="09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.486850 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.537667 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.541115 5094 scope.go:117] "RemoveContainer" containerID="a8b2fd7d2d6131025e57a52935b313eca72cf89c2a15b9432a7f1f536f05a672" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.548167 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:43:33 crc kubenswrapper[5094]: I0220 08:43:33.858900 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" path="/var/lib/kubelet/pods/14b1e378-3ae1-4707-8c14-3ee7ad292a55/volumes" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.107394 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.107986 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.108220 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.109778 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.110067 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" gracePeriod=600 Feb 20 08:43:34 crc kubenswrapper[5094]: E0220 08:43:34.243575 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:43:34 crc 
kubenswrapper[5094]: I0220 08:43:34.507987 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" exitCode=0 Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.508077 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"} Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.508410 5094 scope.go:117] "RemoveContainer" containerID="4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.509010 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:43:34 crc kubenswrapper[5094]: E0220 08:43:34.509298 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.798658 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.798737 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.824206 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 08:43:34 crc 
kubenswrapper[5094]: I0220 08:43:34.835252 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 08:43:35 crc kubenswrapper[5094]: I0220 08:43:35.525140 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 08:43:35 crc kubenswrapper[5094]: I0220 08:43:35.525178 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.801218 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.801487 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.829188 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.836781 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.093399 5094 scope.go:117] "RemoveContainer" containerID="51c5d24049c628fe72ec29c7d6aad6b1b26637f9d1812b5f27f768d7d83239ed" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.401794 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.418659 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.540553 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:37 crc 
kubenswrapper[5094]: I0220 08:43:37.540604 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:39 crc kubenswrapper[5094]: I0220 08:43:39.450219 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:39 crc kubenswrapper[5094]: I0220 08:43:39.457650 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:46 crc kubenswrapper[5094]: I0220 08:43:46.840436 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:43:46 crc kubenswrapper[5094]: E0220 08:43:46.841090 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.088418 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:43:47 crc kubenswrapper[5094]: E0220 08:43:47.088912 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.088932 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" Feb 20 08:43:47 crc kubenswrapper[5094]: E0220 08:43:47.088967 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="init" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.088977 
5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="init" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.089212 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.089936 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.097761 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.198577 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.198818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.199898 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.201851 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.203800 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.209152 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.299850 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.299931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.299958 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.300064 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"placement-63f8-account-create-update-szqmc\" (UID: 
\"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.300569 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.319366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.401157 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.401553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.402382 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod 
\"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.411751 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.417422 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.545840 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.875797 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.883805 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.629420 5094 generic.go:334] "Generic (PLEG): container finished" podID="0e67bd4c-454a-4166-9e28-49c348795b29" containerID="9b130573754c208a821c2a5aa00744abfcde1ec2f224d985ae00e81ebcaa218e" exitCode=0 Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.629478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-r5qjd" event={"ID":"0e67bd4c-454a-4166-9e28-49c348795b29","Type":"ContainerDied","Data":"9b130573754c208a821c2a5aa00744abfcde1ec2f224d985ae00e81ebcaa218e"} Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.629864 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-r5qjd" event={"ID":"0e67bd4c-454a-4166-9e28-49c348795b29","Type":"ContainerStarted","Data":"b7b0ed74d0876d26c24341bb21762e9ffb513b9e5d55e63aee8133e9c96863bc"} Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.631997 5094 generic.go:334] "Generic (PLEG): container finished" podID="d5bcef59-b989-4157-8233-6482f9f3abab" containerID="8dfc18891e7f2cecc2e704cc07266d7a47a98f1dcf9f167194c7d37d347b850e" exitCode=0 Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.632049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-63f8-account-create-update-szqmc" event={"ID":"d5bcef59-b989-4157-8233-6482f9f3abab","Type":"ContainerDied","Data":"8dfc18891e7f2cecc2e704cc07266d7a47a98f1dcf9f167194c7d37d347b850e"} Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.632080 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-63f8-account-create-update-szqmc" event={"ID":"d5bcef59-b989-4157-8233-6482f9f3abab","Type":"ContainerStarted","Data":"e4b2fb567d00953b04fddae3ab249dedcb5e7873615fe6ecf6147543376c3a09"} Feb 20 08:43:49 crc kubenswrapper[5094]: I0220 08:43:49.944104 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.034408 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.047564 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod \"d5bcef59-b989-4157-8233-6482f9f3abab\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.047656 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"d5bcef59-b989-4157-8233-6482f9f3abab\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.048307 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5bcef59-b989-4157-8233-6482f9f3abab" (UID: "d5bcef59-b989-4157-8233-6482f9f3abab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.052758 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8" (OuterVolumeSpecName: "kube-api-access-rvlf8") pod "d5bcef59-b989-4157-8233-6482f9f3abab" (UID: "d5bcef59-b989-4157-8233-6482f9f3abab"). InnerVolumeSpecName "kube-api-access-rvlf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"0e67bd4c-454a-4166-9e28-49c348795b29\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149184 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"0e67bd4c-454a-4166-9e28-49c348795b29\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149636 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149656 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e67bd4c-454a-4166-9e28-49c348795b29" (UID: "0e67bd4c-454a-4166-9e28-49c348795b29"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.153301 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk" (OuterVolumeSpecName: "kube-api-access-p5pgk") pod "0e67bd4c-454a-4166-9e28-49c348795b29" (UID: "0e67bd4c-454a-4166-9e28-49c348795b29"). InnerVolumeSpecName "kube-api-access-p5pgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.251421 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") on node \"crc\" DevicePath \"\""
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.251505 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.653335 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-r5qjd"
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.653336 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-r5qjd" event={"ID":"0e67bd4c-454a-4166-9e28-49c348795b29","Type":"ContainerDied","Data":"b7b0ed74d0876d26c24341bb21762e9ffb513b9e5d55e63aee8133e9c96863bc"}
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.653748 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7b0ed74d0876d26c24341bb21762e9ffb513b9e5d55e63aee8133e9c96863bc"
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.655850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-63f8-account-create-update-szqmc" event={"ID":"d5bcef59-b989-4157-8233-6482f9f3abab","Type":"ContainerDied","Data":"e4b2fb567d00953b04fddae3ab249dedcb5e7873615fe6ecf6147543376c3a09"}
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.655930 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4b2fb567d00953b04fddae3ab249dedcb5e7873615fe6ecf6147543376c3a09"
Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.655976 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.497738 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"]
Feb 20 08:43:52 crc kubenswrapper[5094]: E0220 08:43:52.498517 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" containerName="mariadb-account-create-update"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498531 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" containerName="mariadb-account-create-update"
Feb 20 08:43:52 crc kubenswrapper[5094]: E0220 08:43:52.498546 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" containerName="mariadb-database-create"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498552 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" containerName="mariadb-database-create"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498791 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" containerName="mariadb-account-create-update"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498816 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" containerName="mariadb-database-create"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.500840 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.510470 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"]
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.534038 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-smd54"]
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.535249 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.538146 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.538398 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p4pxm"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.538552 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.571893 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-smd54"]
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595782 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595873 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595914 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595953 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595985 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596058 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596127 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596160 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596237 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596289 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698065 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698188 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698211 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698243 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698261 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698306 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698368 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703364 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703768 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.719006 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.720675 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.724556 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.742173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.744257 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.821323 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.862128 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-smd54"
Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.306930 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"]
Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.357030 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-smd54"]
Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.686576 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerStarted","Data":"fcdd175ed55b2d24a7ffffadd8ffe886b1a529efda710241b94c420889488a0d"}
Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.689197 5094 generic.go:334] "Generic (PLEG): container finished" podID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerID="e5df72384fdffb4deabf4c6cbb6c43678269bc4fec968e5b22a2e15228a210f5" exitCode=0
Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.689255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerDied","Data":"e5df72384fdffb4deabf4c6cbb6c43678269bc4fec968e5b22a2e15228a210f5"}
Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.689279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerStarted","Data":"ac86d266f2e6539516ad0f20e1cd64fe7ffd8192e43ce128347406606139c997"}
Feb 20 08:43:54 crc kubenswrapper[5094]: I0220 08:43:54.712400 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerStarted","Data":"20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72"}
Feb 20 08:43:54 crc kubenswrapper[5094]: I0220 08:43:54.712907 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:43:54 crc kubenswrapper[5094]: I0220 08:43:54.735406 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-567db69c47-cctzv" podStartSLOduration=2.7353900810000003 podStartE2EDuration="2.735390081s" podCreationTimestamp="2026-02-20 08:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:54.729779506 +0000 UTC m=+7049.602406247" watchObservedRunningTime="2026-02-20 08:43:54.735390081 +0000 UTC m=+7049.608016792"
Feb 20 08:43:57 crc kubenswrapper[5094]: I0220 08:43:57.738896 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerStarted","Data":"3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333"}
Feb 20 08:43:57 crc kubenswrapper[5094]: I0220 08:43:57.769856 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-smd54" podStartSLOduration=2.61271562 podStartE2EDuration="5.769833854s" podCreationTimestamp="2026-02-20 08:43:52 +0000 UTC" firstStartedPulling="2026-02-20 08:43:53.362141372 +0000 UTC m=+7048.234768073" lastFinishedPulling="2026-02-20 08:43:56.519259596 +0000 UTC m=+7051.391886307" observedRunningTime="2026-02-20 08:43:57.760397077 +0000 UTC m=+7052.633023788" watchObservedRunningTime="2026-02-20 08:43:57.769833854 +0000 UTC m=+7052.642460565"
Feb 20 08:43:58 crc kubenswrapper[5094]: I0220 08:43:58.751617 5094 generic.go:334] "Generic (PLEG): container finished" podID="b63f3e88-3e2a-43db-88de-8cf778187671" containerID="3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333" exitCode=0
Feb 20 08:43:58 crc kubenswrapper[5094]: I0220 08:43:58.751726 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerDied","Data":"3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333"}
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.190586 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-smd54"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") "
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246302 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") "
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246428 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") "
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246509 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") "
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246609 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") "
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.247278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs" (OuterVolumeSpecName: "logs") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.251885 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts" (OuterVolumeSpecName: "scripts") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.251975 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps" (OuterVolumeSpecName: "kube-api-access-mvwps") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "kube-api-access-mvwps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.272597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.275022 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data" (OuterVolumeSpecName: "config-data") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348341 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") on node \"crc\" DevicePath \"\""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348375 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348385 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") on node \"crc\" DevicePath \"\""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348396 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348404 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.776660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerDied","Data":"fcdd175ed55b2d24a7ffffadd8ffe886b1a529efda710241b94c420889488a0d"}
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.776697 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcdd175ed55b2d24a7ffffadd8ffe886b1a529efda710241b94c420889488a0d"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.776696 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-smd54"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.850737 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64d8d4f69d-shjqs"]
Feb 20 08:44:00 crc kubenswrapper[5094]: E0220 08:44:00.851461 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" containerName="placement-db-sync"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.851482 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" containerName="placement-db-sync"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.851677 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" containerName="placement-db-sync"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.852745 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.854952 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p4pxm"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.855387 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.856374 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.898650 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d8d4f69d-shjqs"]
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958359 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-config-data\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-combined-ca-bundle\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958698 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-scripts\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958819 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfwwk\" (UniqueName: \"kubernetes.io/projected/cc482d5b-0b27-4293-b02b-7b02007cf790-kube-api-access-hfwwk\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958932 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc482d5b-0b27-4293-b02b-7b02007cf790-logs\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.059948 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc482d5b-0b27-4293-b02b-7b02007cf790-logs\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-config-data\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-combined-ca-bundle\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060104 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-scripts\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060138 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfwwk\" (UniqueName: \"kubernetes.io/projected/cc482d5b-0b27-4293-b02b-7b02007cf790-kube-api-access-hfwwk\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060417 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc482d5b-0b27-4293-b02b-7b02007cf790-logs\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.063620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-scripts\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.064208 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-config-data\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.074756 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-combined-ca-bundle\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.078188 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfwwk\" (UniqueName: \"kubernetes.io/projected/cc482d5b-0b27-4293-b02b-7b02007cf790-kube-api-access-hfwwk\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.195309 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.637814 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d8d4f69d-shjqs"]
Feb 20 08:44:01 crc kubenswrapper[5094]: W0220 08:44:01.644984 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc482d5b_0b27_4293_b02b_7b02007cf790.slice/crio-bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba WatchSource:0}: Error finding container bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba: Status 404 returned error can't find the container with id bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.785354 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d8d4f69d-shjqs" event={"ID":"cc482d5b-0b27-4293-b02b-7b02007cf790","Type":"ContainerStarted","Data":"bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba"}
Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.839842 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"
Feb 20 08:44:01 crc kubenswrapper[5094]: E0220 08:44:01.840150 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.811389 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d8d4f69d-shjqs" event={"ID":"cc482d5b-0b27-4293-b02b-7b02007cf790","Type":"ContainerStarted","Data":"87e44785505a7c4d57850817ba0c257b31f219096005591405831fb50885168d"}
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.812890 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.812994 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d8d4f69d-shjqs"
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.813074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d8d4f69d-shjqs" event={"ID":"cc482d5b-0b27-4293-b02b-7b02007cf790","Type":"ContainerStarted","Data":"ec1b8e7fa8db3b20e94f7aa0892926f5a97eee09b9575ed017cd8f1b46da622a"}
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.822900 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.842476 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64d8d4f69d-shjqs" podStartSLOduration=2.8424556 podStartE2EDuration="2.8424556s" podCreationTimestamp="2026-02-20 08:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:44:02.832112182 +0000 UTC m=+7057.704738923" watchObservedRunningTime="2026-02-20 08:44:02.8424556 +0000 UTC m=+7057.715082311"
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.916028 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"]
Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.916563 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" containerID="cri-o://913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" gracePeriod=10
Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.394806 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn"
Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.506932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") "
Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.506995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") "
Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.507030 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") "
Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.507056 5094 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.507085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.523777 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp" (OuterVolumeSpecName: "kube-api-access-b4kpp") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "kube-api-access-b4kpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.549289 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.549352 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.561650 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config" (OuterVolumeSpecName: "config") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.562216 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608423 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608448 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608457 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608466 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 
08:44:03.608473 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.821957 5094 generic.go:334] "Generic (PLEG): container finished" podID="84d7a949-86f8-4325-8494-1e37848e76ec" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" exitCode=0 Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.822822 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.825870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerDied","Data":"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f"} Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.825912 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerDied","Data":"9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a"} Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.825941 5094 scope.go:117] "RemoveContainer" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.857949 5094 scope.go:117] "RemoveContainer" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.864789 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.875831 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:44:03 crc 
kubenswrapper[5094]: I0220 08:44:03.894295 5094 scope.go:117] "RemoveContainer" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" Feb 20 08:44:03 crc kubenswrapper[5094]: E0220 08:44:03.894614 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f\": container with ID starting with 913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f not found: ID does not exist" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.894650 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f"} err="failed to get container status \"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f\": rpc error: code = NotFound desc = could not find container \"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f\": container with ID starting with 913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f not found: ID does not exist" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.894669 5094 scope.go:117] "RemoveContainer" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" Feb 20 08:44:03 crc kubenswrapper[5094]: E0220 08:44:03.895008 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100\": container with ID starting with 208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100 not found: ID does not exist" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.895028 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100"} err="failed to get container status \"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100\": rpc error: code = NotFound desc = could not find container \"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100\": container with ID starting with 208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100 not found: ID does not exist" Feb 20 08:44:05 crc kubenswrapper[5094]: I0220 08:44:05.865785 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" path="/var/lib/kubelet/pods/84d7a949-86f8-4325-8494-1e37848e76ec/volumes" Feb 20 08:44:13 crc kubenswrapper[5094]: I0220 08:44:13.840415 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:13 crc kubenswrapper[5094]: E0220 08:44:13.841314 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:28 crc kubenswrapper[5094]: I0220 08:44:28.840104 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:28 crc kubenswrapper[5094]: E0220 08:44:28.840801 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:32 crc kubenswrapper[5094]: I0220 08:44:32.226935 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:32 crc kubenswrapper[5094]: I0220 08:44:32.229768 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:37 crc kubenswrapper[5094]: I0220 08:44:37.199303 5094 scope.go:117] "RemoveContainer" containerID="63ebc0843e1f18b14a59e973034594be94e63f23e4b14fd38a292a888d5971cd" Feb 20 08:44:37 crc kubenswrapper[5094]: I0220 08:44:37.229188 5094 scope.go:117] "RemoveContainer" containerID="c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43" Feb 20 08:44:39 crc kubenswrapper[5094]: I0220 08:44:39.839817 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:39 crc kubenswrapper[5094]: E0220 08:44:39.840382 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:53 crc kubenswrapper[5094]: I0220 08:44:53.843483 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:53 crc kubenswrapper[5094]: E0220 08:44:53.844363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.046652 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:44:56 crc kubenswrapper[5094]: E0220 08:44:56.047547 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.047565 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" Feb 20 08:44:56 crc kubenswrapper[5094]: E0220 08:44:56.047589 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="init" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.047596 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="init" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.047774 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.048346 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.053257 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.084852 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.084930 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.143170 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.145638 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.152229 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191334 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191478 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191508 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.197724 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.221074 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.264237 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.265692 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.267676 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.292136 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.293724 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.293872 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod 
\"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.294587 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.314453 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.369072 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.372365 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.397749 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.397863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.404018 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.404114 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.462163 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.474689 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.476111 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.483114 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.490351 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499571 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499619 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499653 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499675 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"nova-api-1c22-account-create-update-wl44h\" 
(UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.500613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.566969 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.586067 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600771 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.601419 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.622844 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.657157 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.658509 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.673308 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.674321 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702174 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702232 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702259 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702283 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.703178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.719016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc 
kubenswrapper[5094]: I0220 08:44:56.804014 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.804082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.804700 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.813688 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.841194 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.890329 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.989179 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.033873 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.064025 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab2f8a8_e11c_4b13_a12f_7006756e4d56.slice/crio-71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71 WatchSource:0}: Error finding container 71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71: Status 404 returned error can't find the container with id 71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71 Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.150469 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.205716 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.217920 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95274e98_2b48_4b4d_b0c5_5dedafedc43f.slice/crio-1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96 WatchSource:0}: Error finding container 1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96: Status 404 returned error can't find the container with id 1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96 Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.295795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-qwnhp" event={"ID":"62afc590-4a32-45a1-b7e9-bde09c7f0b6a","Type":"ContainerStarted","Data":"3402431af5c0b829ac8fa80c190f9199ba60a6730114334bae45f899a11cb0fe"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.300125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerStarted","Data":"18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.300168 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerStarted","Data":"71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.305350 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1c22-account-create-update-wl44h" event={"ID":"95274e98-2b48-4b4d-b0c5-5dedafedc43f","Type":"ContainerStarted","Data":"1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.319791 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-q8cvq" podStartSLOduration=1.319770209 podStartE2EDuration="1.319770209s" podCreationTimestamp="2026-02-20 08:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:44:57.316301355 +0000 UTC m=+7112.188928056" watchObservedRunningTime="2026-02-20 08:44:57.319770209 +0000 UTC m=+7112.192396920" Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.342325 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode24ca1b9_7440_432c_a0eb_58a17f83a8ee.slice/crio-a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2 WatchSource:0}: Error finding container a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2: Status 404 returned error can't find the container with id a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2 Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.373221 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.506626 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.549926 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90396e9c_2602_41dd_92c3_da38bb5f7be7.slice/crio-e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c WatchSource:0}: Error finding container e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c: Status 404 returned error can't find the container with id e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.624651 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.635121 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e893dc_597d_4b0d_b59d_04c636d58ce4.slice/crio-9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf WatchSource:0}: Error finding container 9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf: Status 404 returned error can't find the container with id 
9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.314395 5094 generic.go:334] "Generic (PLEG): container finished" podID="33e893dc-597d-4b0d-b59d-04c636d58ce4" containerID="3f997facacb6313e0f115f2a2227ee22f54c84973cc33a1b4f4cc4cd0e2df3df" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.314501 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" event={"ID":"33e893dc-597d-4b0d-b59d-04c636d58ce4","Type":"ContainerDied","Data":"3f997facacb6313e0f115f2a2227ee22f54c84973cc33a1b4f4cc4cd0e2df3df"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.314567 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" event={"ID":"33e893dc-597d-4b0d-b59d-04c636d58ce4","Type":"ContainerStarted","Data":"9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.316242 5094 generic.go:334] "Generic (PLEG): container finished" podID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerID="18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.316279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerDied","Data":"18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.317996 5094 generic.go:334] "Generic (PLEG): container finished" podID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerID="0d114a7c88828f83388e2e035f175ac9a3e4b92dd7429d32fee56582784e51b6" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.318049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1c22-account-create-update-wl44h" 
event={"ID":"95274e98-2b48-4b4d-b0c5-5dedafedc43f","Type":"ContainerDied","Data":"0d114a7c88828f83388e2e035f175ac9a3e4b92dd7429d32fee56582784e51b6"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.321365 5094 generic.go:334] "Generic (PLEG): container finished" podID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerID="814c099e47197bd5868d74e553deb48d652a97c496a27496d27f367ca0750674" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.321461 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" event={"ID":"90396e9c-2602-41dd-92c3-da38bb5f7be7","Type":"ContainerDied","Data":"814c099e47197bd5868d74e553deb48d652a97c496a27496d27f367ca0750674"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.321498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" event={"ID":"90396e9c-2602-41dd-92c3-da38bb5f7be7","Type":"ContainerStarted","Data":"e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.324964 5094 generic.go:334] "Generic (PLEG): container finished" podID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerID="7d035de1d36dadcbc2b1699a2d04fbaf8dc66a5156f2934f3122a703497829c7" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.325081 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vjfql" event={"ID":"e24ca1b9-7440-432c-a0eb-58a17f83a8ee","Type":"ContainerDied","Data":"7d035de1d36dadcbc2b1699a2d04fbaf8dc66a5156f2934f3122a703497829c7"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.325137 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vjfql" event={"ID":"e24ca1b9-7440-432c-a0eb-58a17f83a8ee","Type":"ContainerStarted","Data":"a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 
08:44:58.327209 5094 generic.go:334] "Generic (PLEG): container finished" podID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerID="b22a4c98fab8bd430cea1082edfc23c911f8d32bd3adc55526aec0a42c5684bd" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.327274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qwnhp" event={"ID":"62afc590-4a32-45a1-b7e9-bde09c7f0b6a","Type":"ContainerDied","Data":"b22a4c98fab8bd430cea1082edfc23c911f8d32bd3adc55526aec0a42c5684bd"} Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.771108 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.861412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"33e893dc-597d-4b0d-b59d-04c636d58ce4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.861652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"33e893dc-597d-4b0d-b59d-04c636d58ce4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.862122 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33e893dc-597d-4b0d-b59d-04c636d58ce4" (UID: "33e893dc-597d-4b0d-b59d-04c636d58ce4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.866882 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79" (OuterVolumeSpecName: "kube-api-access-w4b79") pod "33e893dc-597d-4b0d-b59d-04c636d58ce4" (UID: "33e893dc-597d-4b0d-b59d-04c636d58ce4"). InnerVolumeSpecName "kube-api-access-w4b79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.942850 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.949228 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.958603 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.963539 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.963681 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.964828 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.979987 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065359 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065399 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065425 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065461 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"90396e9c-2602-41dd-92c3-da38bb5f7be7\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065489 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"90396e9c-2602-41dd-92c3-da38bb5f7be7\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065546 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065588 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065648 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065735 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065897 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"62afc590-4a32-45a1-b7e9-bde09c7f0b6a" (UID: "62afc590-4a32-45a1-b7e9-bde09c7f0b6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95274e98-2b48-4b4d-b0c5-5dedafedc43f" (UID: "95274e98-2b48-4b4d-b0c5-5dedafedc43f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.066172 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.066192 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.066269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ab2f8a8-e11c-4b13-a12f-7006756e4d56" (UID: "2ab2f8a8-e11c-4b13-a12f-7006756e4d56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.067865 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90396e9c-2602-41dd-92c3-da38bb5f7be7" (UID: "90396e9c-2602-41dd-92c3-da38bb5f7be7"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.069252 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e24ca1b9-7440-432c-a0eb-58a17f83a8ee" (UID: "e24ca1b9-7440-432c-a0eb-58a17f83a8ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070046 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857" (OuterVolumeSpecName: "kube-api-access-7r857") pod "2ab2f8a8-e11c-4b13-a12f-7006756e4d56" (UID: "2ab2f8a8-e11c-4b13-a12f-7006756e4d56"). InnerVolumeSpecName "kube-api-access-7r857". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070129 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld" (OuterVolumeSpecName: "kube-api-access-hsmld") pod "e24ca1b9-7440-432c-a0eb-58a17f83a8ee" (UID: "e24ca1b9-7440-432c-a0eb-58a17f83a8ee"). InnerVolumeSpecName "kube-api-access-hsmld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk" (OuterVolumeSpecName: "kube-api-access-jstfk") pod "90396e9c-2602-41dd-92c3-da38bb5f7be7" (UID: "90396e9c-2602-41dd-92c3-da38bb5f7be7"). InnerVolumeSpecName "kube-api-access-jstfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6" (OuterVolumeSpecName: "kube-api-access-jkjz6") pod "62afc590-4a32-45a1-b7e9-bde09c7f0b6a" (UID: "62afc590-4a32-45a1-b7e9-bde09c7f0b6a"). InnerVolumeSpecName "kube-api-access-jkjz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm" (OuterVolumeSpecName: "kube-api-access-mmkdm") pod "95274e98-2b48-4b4d-b0c5-5dedafedc43f" (UID: "95274e98-2b48-4b4d-b0c5-5dedafedc43f"). InnerVolumeSpecName "kube-api-access-mmkdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134426 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134912 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134930 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134948 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134955 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" 
containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134978 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134984 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134998 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135003 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.135017 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135023 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.135034 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135040 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135206 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 
08:45:00.135215 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135225 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135235 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135248 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135259 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135893 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.139767 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.140952 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.141980 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167722 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167894 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167996 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168009 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168019 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168028 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168047 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168056 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") on node 
\"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168094 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.269359 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.269631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.269790 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.270558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 
08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.272636 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.285434 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.359064 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1c22-account-create-update-wl44h" event={"ID":"95274e98-2b48-4b4d-b0c5-5dedafedc43f","Type":"ContainerDied","Data":"1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.359116 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.359092 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.360631 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.360675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" event={"ID":"90396e9c-2602-41dd-92c3-da38bb5f7be7","Type":"ContainerDied","Data":"e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.360794 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.362600 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vjfql" event={"ID":"e24ca1b9-7440-432c-a0eb-58a17f83a8ee","Type":"ContainerDied","Data":"a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.362858 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.362618 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.365039 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.365043 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qwnhp" event={"ID":"62afc590-4a32-45a1-b7e9-bde09c7f0b6a","Type":"ContainerDied","Data":"3402431af5c0b829ac8fa80c190f9199ba60a6730114334bae45f899a11cb0fe"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.365154 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3402431af5c0b829ac8fa80c190f9199ba60a6730114334bae45f899a11cb0fe" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.367172 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" event={"ID":"33e893dc-597d-4b0d-b59d-04c636d58ce4","Type":"ContainerDied","Data":"9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.367200 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.367269 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.369047 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerDied","Data":"71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.369068 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.369110 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.457465 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.041995 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.380194 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerStarted","Data":"8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48"} Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.380524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerStarted","Data":"05446f0e59e5eb589aee3a0f185fa7c010261fa51c4c3bcc0680f6fef958ea2c"} Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.406140 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" podStartSLOduration=1.406119283 podStartE2EDuration="1.406119283s" podCreationTimestamp="2026-02-20 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:01.405658992 +0000 UTC m=+7116.278285703" watchObservedRunningTime="2026-02-20 08:45:01.406119283 +0000 UTC m=+7116.278745994" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.852943 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.853882 5094 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.853962 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.856231 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2cnhr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.856451 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.856735 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895805 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: 
\"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895869 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997419 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997575 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod 
\"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.004995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.005332 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.005773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.029544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.181108 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.391817 5094 generic.go:334] "Generic (PLEG): container finished" podID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerID="8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48" exitCode=0 Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.391869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerDied","Data":"8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48"} Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.640503 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:45:02 crc kubenswrapper[5094]: W0220 08:45:02.641073 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8cc89fb_1ef5_4f62_afbc_a06a1d75fa91.slice/crio-3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532 WatchSource:0}: Error finding container 3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532: Status 404 returned error can't find the container with id 3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532 Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.399354 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerStarted","Data":"3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532"} Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.734050 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.828512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"4271712d-7fb9-4862-bc38-e3cfbcced425\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.828878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"4271712d-7fb9-4862-bc38-e3cfbcced425\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.828945 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"4271712d-7fb9-4862-bc38-e3cfbcced425\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.829497 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume" (OuterVolumeSpecName: "config-volume") pod "4271712d-7fb9-4862-bc38-e3cfbcced425" (UID: "4271712d-7fb9-4862-bc38-e3cfbcced425"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.834229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx" (OuterVolumeSpecName: "kube-api-access-5c9mx") pod "4271712d-7fb9-4862-bc38-e3cfbcced425" (UID: "4271712d-7fb9-4862-bc38-e3cfbcced425"). 
InnerVolumeSpecName "kube-api-access-5c9mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.838860 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4271712d-7fb9-4862-bc38-e3cfbcced425" (UID: "4271712d-7fb9-4862-bc38-e3cfbcced425"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.930208 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.930238 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.930250 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.414363 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerDied","Data":"05446f0e59e5eb589aee3a0f185fa7c010261fa51c4c3bcc0680f6fef958ea2c"} Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.414412 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05446f0e59e5eb589aee3a0f185fa7c010261fa51c4c3bcc0680f6fef958ea2c" Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.414477 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.496682 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"] Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.504141 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"] Feb 20 08:45:05 crc kubenswrapper[5094]: I0220 08:45:05.853181 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" path="/var/lib/kubelet/pods/a036c1c3-0425-4a2e-a42d-2abfcdc49620/volumes" Feb 20 08:45:07 crc kubenswrapper[5094]: I0220 08:45:07.841039 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:45:07 crc kubenswrapper[5094]: E0220 08:45:07.841300 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:45:11 crc kubenswrapper[5094]: I0220 08:45:11.466357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerStarted","Data":"cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5"} Feb 20 08:45:11 crc kubenswrapper[5094]: I0220 08:45:11.486198 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" podStartSLOduration=2.327103886 
podStartE2EDuration="10.486180277s" podCreationTimestamp="2026-02-20 08:45:01 +0000 UTC" firstStartedPulling="2026-02-20 08:45:02.643002502 +0000 UTC m=+7117.515629223" lastFinishedPulling="2026-02-20 08:45:10.802078903 +0000 UTC m=+7125.674705614" observedRunningTime="2026-02-20 08:45:11.484565858 +0000 UTC m=+7126.357192569" watchObservedRunningTime="2026-02-20 08:45:11.486180277 +0000 UTC m=+7126.358806988" Feb 20 08:45:16 crc kubenswrapper[5094]: I0220 08:45:16.508730 5094 generic.go:334] "Generic (PLEG): container finished" podID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerID="cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5" exitCode=0 Feb 20 08:45:16 crc kubenswrapper[5094]: I0220 08:45:16.508767 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerDied","Data":"cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5"} Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.800159 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959139 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959316 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959363 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.964718 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts" (OuterVolumeSpecName: "scripts") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.967976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425" (OuterVolumeSpecName: "kube-api-access-xb425") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "kube-api-access-xb425". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.987692 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data" (OuterVolumeSpecName: "config-data") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.001753 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061196 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061236 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061249 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061257 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.525083 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerDied","Data":"3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532"} Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.525131 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.525131 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.645596 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:45:18 crc kubenswrapper[5094]: E0220 08:45:18.646014 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerName="nova-cell0-conductor-db-sync" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646032 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerName="nova-cell0-conductor-db-sync" Feb 20 08:45:18 crc kubenswrapper[5094]: E0220 08:45:18.646043 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerName="collect-profiles" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646052 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerName="collect-profiles" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646237 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerName="collect-profiles" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646255 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerName="nova-cell0-conductor-db-sync" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.649441 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.652034 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2cnhr" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.652222 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.656899 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.775168 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.775612 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.775775 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.877117 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c49n4\" (UniqueName: 
\"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.877213 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.877316 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.882119 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.882287 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.900547 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"nova-cell0-conductor-0\" (UID: 
\"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.970965 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:19 crc kubenswrapper[5094]: I0220 08:45:19.423679 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:45:19 crc kubenswrapper[5094]: I0220 08:45:19.540827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerStarted","Data":"ed4718ea392d6b6de8bbaf21e72aac2f90b8c262b455dfd4b484e4049f29e229"} Feb 20 08:45:19 crc kubenswrapper[5094]: I0220 08:45:19.839685 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:45:19 crc kubenswrapper[5094]: E0220 08:45:19.839935 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:45:20 crc kubenswrapper[5094]: I0220 08:45:20.551979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerStarted","Data":"e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58"} Feb 20 08:45:20 crc kubenswrapper[5094]: I0220 08:45:20.553171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:20 crc kubenswrapper[5094]: I0220 08:45:20.577170 5094 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.577156372 podStartE2EDuration="2.577156372s" podCreationTimestamp="2026-02-20 08:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:20.569805506 +0000 UTC m=+7135.442432217" watchObservedRunningTime="2026-02-20 08:45:20.577156372 +0000 UTC m=+7135.449783083" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.004422 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.451680 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.453748 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.460998 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.461979 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.462042 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572606 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572725 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572783 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572812 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.603656 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.605225 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.607151 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.614952 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.616941 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.620084 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.637253 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.653386 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.676955 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.677025 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.677063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.677084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-62kch\" (UID: 
\"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.690269 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.690754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.694309 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.709754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.721021 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.722660 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.728266 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.746851 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.780436 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.780949 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781223 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 
08:45:29.781469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781587 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.798032 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.806226 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.807961 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.810975 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.839943 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.887817 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888170 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888370 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888614 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888661 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.902552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.903844 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.904634 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.918743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.929576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.931482 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.935056 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.940746 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.964775 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.970100 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.976910 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990487 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990538 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990592 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990672 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990697 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.991273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.996391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.001893 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.014847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 
20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.092113 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096152 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096200 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096240 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096271 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096304 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096338 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096365 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096411 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.098827 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.103018 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.115359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.129832 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.176920 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.198855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.198918 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.198986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.199031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.199066 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.199926 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.200532 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.200857 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.201118 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.219744 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.225721 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.298315 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.426667 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.510689 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.512475 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.518643 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.518856 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.525413 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"]
Feb 20 08:45:30 crc kubenswrapper[5094]: W0220 08:45:30.543849 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59f04fe3_56d8_4fcb_a1bf_35b730bd7d89.slice/crio-9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e WatchSource:0}: Error finding container 9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e: Status 404 returned error can't find the container with id 9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.554861 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607821 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.652762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerStarted","Data":"9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e"}
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.658138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerStarted","Data":"a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e"}
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.658169 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerStarted","Data":"4451dcbad7b501ecea1158230334701023d79a8cdb69845458a3f3fcb82a0bfe"}
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.674074 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: W0220 08:45:30.675778 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecd0e105_6bf4_436e_9d70_1b42f662e67f.slice/crio-984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf WatchSource:0}: Error finding container 984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf: Status 404 returned error can't find the container with id 984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.679760 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-62kch" podStartSLOduration=1.679733987 podStartE2EDuration="1.679733987s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:30.672217975 +0000 UTC m=+7145.544844686" watchObservedRunningTime="2026-02-20 08:45:30.679733987 +0000 UTC m=+7145.552360718"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.709929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.709983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.710054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.710080 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.719089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.721552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.731024 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.732589 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.739524 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.843151 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.892055 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.900935 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"]
Feb 20 08:45:30 crc kubenswrapper[5094]: W0220 08:45:30.924296 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ea7a9d_b48a_4ea9_be81_50d152a57e58.slice/crio-aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c WatchSource:0}: Error finding container aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c: Status 404 returned error can't find the container with id aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.355959 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"]
Feb 20 08:45:31 crc kubenswrapper[5094]: W0220 08:45:31.368988 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod077dc649_6898_4f04_837d_b694decf612b.slice/crio-726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441 WatchSource:0}: Error finding container 726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441: Status 404 returned error can't find the container with id 726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.694881 5094 generic.go:334] "Generic (PLEG): container finished" podID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerID="c16a7fdf9fc7f05c88a3c26ca7db3574a5f1cb1c1c0097c7a0361cae6e703d9a" exitCode=0
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.695012 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerDied","Data":"c16a7fdf9fc7f05c88a3c26ca7db3574a5f1cb1c1c0097c7a0361cae6e703d9a"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.695041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerStarted","Data":"aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.697721 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerStarted","Data":"984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.699752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerStarted","Data":"20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.699778 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerStarted","Data":"726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.702034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerStarted","Data":"3235e0f9a591cd9fc84d38dd78cd295e9068409f0e216b0a40d76217b8e522fa"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.704662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerStarted","Data":"2c8f9dde56772efe4f4ead50b7fa81668866d757af9f32b2557ab9828d4e4e5e"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.747290 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" podStartSLOduration=1.747266283 podStartE2EDuration="1.747266283s" podCreationTimestamp="2026-02-20 08:45:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:31.731187016 +0000 UTC m=+7146.603813727" watchObservedRunningTime="2026-02-20 08:45:31.747266283 +0000 UTC m=+7146.619892994"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.728315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerStarted","Data":"df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.730985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerStarted","Data":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.731024 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerStarted","Data":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.733278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerStarted","Data":"362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.733729 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.741447 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerStarted","Data":"eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.761408 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6816731369999998 podStartE2EDuration="4.761385547s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.900780594 +0000 UTC m=+7145.773407305" lastFinishedPulling="2026-02-20 08:45:32.980492964 +0000 UTC m=+7147.853119715" observedRunningTime="2026-02-20 08:45:33.75115255 +0000 UTC m=+7148.623779261" watchObservedRunningTime="2026-02-20 08:45:33.761385547 +0000 UTC m=+7148.634012278"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.763639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerStarted","Data":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.763686 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerStarted","Data":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.786172 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" podStartSLOduration=4.786151762 podStartE2EDuration="4.786151762s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:33.774135463 +0000 UTC m=+7148.646762174" watchObservedRunningTime="2026-02-20 08:45:33.786151762 +0000 UTC m=+7148.658778473"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.812838 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.573476443 podStartE2EDuration="4.812819303s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.741484852 +0000 UTC m=+7145.614111563" lastFinishedPulling="2026-02-20 08:45:32.980827712 +0000 UTC m=+7147.853454423" observedRunningTime="2026-02-20 08:45:33.798051778 +0000 UTC m=+7148.670678489" watchObservedRunningTime="2026-02-20 08:45:33.812819303 +0000 UTC m=+7148.685446014"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.819439 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5219750039999997 podStartE2EDuration="4.819422192s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.67904759 +0000 UTC m=+7145.551674301" lastFinishedPulling="2026-02-20 08:45:32.976494778 +0000 UTC m=+7147.849121489" observedRunningTime="2026-02-20 08:45:33.814050143 +0000 UTC m=+7148.686676864" watchObservedRunningTime="2026-02-20 08:45:33.819422192 +0000 UTC m=+7148.692048903"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.839012 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.406714001 podStartE2EDuration="4.838994573s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.546313587 +0000 UTC m=+7145.418940288" lastFinishedPulling="2026-02-20 08:45:32.978594149 +0000 UTC m=+7147.851220860" observedRunningTime="2026-02-20 08:45:33.837081647 +0000 UTC m=+7148.709708368" watchObservedRunningTime="2026-02-20 08:45:33.838994573 +0000 UTC m=+7148.711621284"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.840172 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"
Feb 20 08:45:33 crc kubenswrapper[5094]: E0220 08:45:33.840443 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:45:34 crc kubenswrapper[5094]: I0220 08:45:34.774043 5094 generic.go:334] "Generic (PLEG): container finished" podID="077dc649-6898-4f04-837d-b694decf612b" containerID="20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866" exitCode=0
Feb 20 08:45:34 crc kubenswrapper[5094]: I0220 08:45:34.774875 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerDied","Data":"20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866"}
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.092533 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.178069 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.178118 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.225891 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.782999 5094 generic.go:334] "Generic (PLEG): container finished" podID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerID="a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e" exitCode=0
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.783110 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerDied","Data":"a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e"}
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.127653 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.216730 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.217117 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.217212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.217235 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.222694 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts" (OuterVolumeSpecName: "scripts") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.227123 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh" (OuterVolumeSpecName: "kube-api-access-vcnmh") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "kube-api-access-vcnmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.271024 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data" (OuterVolumeSpecName: "config-data") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.274967 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318732 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318755 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318766 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318775 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.797568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerDied","Data":"726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441"}
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.797641 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.797662 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.892611 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 08:45:36 crc kubenswrapper[5094]: E0220 08:45:36.893293 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dc649-6898-4f04-837d-b694decf612b" containerName="nova-cell1-conductor-db-sync"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.893324 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dc649-6898-4f04-837d-b694decf612b" containerName="nova-cell1-conductor-db-sync"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.893646 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dc649-6898-4f04-837d-b694decf612b" containerName="nova-cell1-conductor-db-sync"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.894636 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.898692 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.903540 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.931764 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.931850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.931909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.033815 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:37 crc 
kubenswrapper[5094]: I0220 08:45:37.033871 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.033906 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.053815 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.054397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.055050 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.203232 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.229060 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238230 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238368 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238398 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.243853 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts" (OuterVolumeSpecName: "scripts") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.250092 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4" (OuterVolumeSpecName: "kube-api-access-4tkl4") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). InnerVolumeSpecName "kube-api-access-4tkl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.277120 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data" (OuterVolumeSpecName: "config-data") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.279783 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.332969 5094 scope.go:117] "RemoveContainer" containerID="3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342891 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342944 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342963 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342981 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.386314 5094 scope.go:117] "RemoveContainer" containerID="a7f01ab3dfebce16c461640e15ab5cb83ed76e8a8bf4b49d9de590c4cb6aacd4" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.681394 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:45:37 crc kubenswrapper[5094]: W0220 08:45:37.688880 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bd3b441_92b9_4fd4_8451_dec1c354915e.slice/crio-05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c WatchSource:0}: Error finding container 
05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c: Status 404 returned error can't find the container with id 05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.808649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerDied","Data":"4451dcbad7b501ecea1158230334701023d79a8cdb69845458a3f3fcb82a0bfe"} Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.808681 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.808692 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4451dcbad7b501ecea1158230334701023d79a8cdb69845458a3f3fcb82a0bfe" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.813699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerStarted","Data":"05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c"} Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.990216 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.990780 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log" containerID="cri-o://bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" gracePeriod=30 Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.990944 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api" 
containerID="cri-o://471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.008493 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.008771 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler" containerID="cri-o://eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.020381 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.020574 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log" containerID="cri-o://4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.020721 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata" containerID="cri-o://a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.565082 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.573865 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667269 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667396 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667492 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667585 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667657 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.670540 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs" (OuterVolumeSpecName: "logs") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.679767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv" (OuterVolumeSpecName: "kube-api-access-59tqv") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "kube-api-access-59tqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.680095 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs" (OuterVolumeSpecName: "logs") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.707054 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897" (OuterVolumeSpecName: "kube-api-access-8z897") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "kube-api-access-8z897". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.710884 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data" (OuterVolumeSpecName: "config-data") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.734974 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.751908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770263 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770295 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770306 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770316 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770325 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770333 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770346 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770923 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data" (OuterVolumeSpecName: "config-data") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823386 5094 generic.go:334] "Generic (PLEG): container finished" podID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" exitCode=0 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823419 5094 generic.go:334] "Generic (PLEG): container finished" podID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" exitCode=143 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerDied","Data":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823467 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823494 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerDied","Data":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerDied","Data":"9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823543 5094 scope.go:117] "RemoveContainer" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.827935 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf10215b-d08a-452a-a7de-e7c828922d47" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" exitCode=0 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.827955 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf10215b-d08a-452a-a7de-e7c828922d47" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" exitCode=143 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.827992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerDied","Data":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.828014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerDied","Data":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.828023 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerDied","Data":"2c8f9dde56772efe4f4ead50b7fa81668866d757af9f32b2557ab9828d4e4e5e"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.828070 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.835824 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerStarted","Data":"43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.835969 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.844536 5094 scope.go:117] "RemoveContainer" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.865001 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.864982898 podStartE2EDuration="2.864982898s" podCreationTimestamp="2026-02-20 08:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:38.851417641 +0000 UTC m=+7153.724044352" watchObservedRunningTime="2026-02-20 08:45:38.864982898 +0000 UTC m=+7153.737609609" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.866873 5094 scope.go:117] "RemoveContainer" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.867549 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": container with ID starting with 471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55 not found: ID does not exist" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.867585 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"} err="failed to get container status \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": rpc error: code = NotFound desc = could not find container \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": container with ID starting with 471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.867608 5094 scope.go:117] "RemoveContainer" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.868086 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": container with ID starting with bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51 not found: ID does not exist" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.868140 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"} err="failed to get container status \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": rpc error: code = NotFound desc = could not find container \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": container with ID 
starting with bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51 not found: ID does not exist"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.868174 5094 scope.go:117] "RemoveContainer" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.874905 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"} err="failed to get container status \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": rpc error: code = NotFound desc = could not find container \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": container with ID starting with 471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55 not found: ID does not exist"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.874948 5094 scope.go:117] "RemoveContainer" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.877042 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.878023 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"} err="failed to get container status \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": rpc error: code = NotFound desc = could not find container \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": container with ID starting with bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51 not found: ID does not exist"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.878053 5094 scope.go:117] "RemoveContainer" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.889853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.896043 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.899571 5094 scope.go:117] "RemoveContainer" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.920921 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.922981 5094 scope.go:117] "RemoveContainer" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"
Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.923384 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": container with ID starting with a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2 not found: ID does not exist" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.923416 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"} err="failed to get container status \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": rpc error: code = NotFound desc = could not find container \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": container with ID starting with a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2 not found: ID does not exist"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.923441 5094 scope.go:117] "RemoveContainer" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"
Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.925532 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": container with ID starting with 4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866 not found: ID does not exist" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.925552 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"} err="failed to get container status \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": rpc error: code = NotFound desc = could not find container \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": container with ID starting with 4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866 not found: ID does not exist"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.925566 5094 scope.go:117] "RemoveContainer" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.925901 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"} err="failed to get container status \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": rpc error: code = NotFound desc = could not find container \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": container with ID starting with a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2 not found: ID does not exist"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.925923 5094 scope.go:117] "RemoveContainer" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.927567 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"} err="failed to get container status \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": rpc error: code = NotFound desc = could not find container \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": container with ID starting with 4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866 not found: ID does not exist"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928479 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928883 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928901 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata"
Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928930 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928936 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api"
Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928947 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928953 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log"
Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928959 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928965 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log"
Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928986 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerName="nova-manage"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928992 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerName="nova-manage"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929135 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerName="nova-manage"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929147 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929155 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929168 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929174 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.930201 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.935815 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.958486 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.968070 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.974750 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.976363 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.980066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.980241 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.980365 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.980461 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.981071 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.991911 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081653 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081768 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081812 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081937 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.083289 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.085636 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.088829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.100444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183397 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183441 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.184272 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.190323 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.191410 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.199514 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.252844 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.301839 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.701426 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:39 crc kubenswrapper[5094]: W0220 08:45:39.707121 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481ed5ff_4180_4ff6_8d5f_b7876b484fb2.slice/crio-f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37 WatchSource:0}: Error finding container f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37: Status 404 returned error can't find the container with id f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.780297 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.851180 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" path="/var/lib/kubelet/pods/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89/volumes"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.853732 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" path="/var/lib/kubelet/pods/cf10215b-d08a-452a-a7de-e7c828922d47/volumes"
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.855098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerStarted","Data":"0e6070237df8767b14a2b477f51aaced221d1a725981607f33d88c8bcb05cbb9"}
Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.855138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerStarted","Data":"f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37"}
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.226941 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.239240 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.299867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.366644 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"]
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.366924 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-567db69c47-cctzv" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns" containerID="cri-o://20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72" gracePeriod=10
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.867743 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerDied","Data":"20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72"}
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.867689 5094 generic.go:334] "Generic (PLEG): container finished" podID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerID="20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72" exitCode=0
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.870043 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerStarted","Data":"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"}
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.870313 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerStarted","Data":"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"}
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.879864 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerStarted","Data":"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"}
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.879907 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerStarted","Data":"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"}
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.881346 5094 generic.go:334] "Generic (PLEG): container finished" podID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerID="eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589" exitCode=0
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.881414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerDied","Data":"eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589"}
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.906087 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.90606666 podStartE2EDuration="2.90606666s" podCreationTimestamp="2026-02-20 08:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:40.89199855 +0000 UTC m=+7155.764625261" watchObservedRunningTime="2026-02-20 08:45:40.90606666 +0000 UTC m=+7155.778693371"
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.907377 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.919767 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.919751728 podStartE2EDuration="2.919751728s" podCreationTimestamp="2026-02-20 08:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:40.915100597 +0000 UTC m=+7155.787727308" watchObservedRunningTime="2026-02-20 08:45:40.919751728 +0000 UTC m=+7155.792378439"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.210721 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.301404 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.334831 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.334891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.335312 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.340035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7" (OuterVolumeSpecName: "kube-api-access-2ljr7") pod "ecd0e105-6bf4-436e-9d70-1b42f662e67f" (UID: "ecd0e105-6bf4-436e-9d70-1b42f662e67f"). InnerVolumeSpecName "kube-api-access-2ljr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.361431 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecd0e105-6bf4-436e-9d70-1b42f662e67f" (UID: "ecd0e105-6bf4-436e-9d70-1b42f662e67f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.364154 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data" (OuterVolumeSpecName: "config-data") pod "ecd0e105-6bf4-436e-9d70-1b42f662e67f" (UID: "ecd0e105-6bf4-436e-9d70-1b42f662e67f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437338 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437541 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437625 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") "
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.438070 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.438096 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.438568 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.440477 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt" (OuterVolumeSpecName: "kube-api-access-dvlvt") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "kube-api-access-dvlvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.476454 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.476467 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.479049 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config" (OuterVolumeSpecName: "config") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.486016 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539658 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539693 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539726 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539747 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539760 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.890778 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.890807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerDied","Data":"984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf"}
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.891175 5094 scope.go:117] "RemoveContainer" containerID="eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.896247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerDied","Data":"ac86d266f2e6539516ad0f20e1cd64fe7ffd8192e43ce128347406606139c997"}
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.897245 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.928225 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.931835 5094 scope.go:117] "RemoveContainer" containerID="20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.942263 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955260 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:41 crc kubenswrapper[5094]: E0220 08:45:41.955717 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955734 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler"
Feb 20 08:45:41 crc kubenswrapper[5094]: E0220 08:45:41.955754 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="init"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955760 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="init"
Feb 20 08:45:41 crc kubenswrapper[5094]: E0220 08:45:41.955786 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955792 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955959 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955973 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.956646 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.958824 5094 scope.go:117] "RemoveContainer" containerID="e5df72384fdffb4deabf4c6cbb6c43678269bc4fec968e5b22a2e15228a210f5"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.966529 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.967863 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"]
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.978678 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.987149 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"]
Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.049679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.050070 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.050159 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"nova-scheduler-0\" (UID: 
\"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.151749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.151841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.151901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.155553 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.156278 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.171163 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.269202 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.272427 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.707262 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.708684 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.711951 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.718285 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.720319 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.758265 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:42 crc kubenswrapper[5094]: W0220 08:45:42.761981 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e15e686_66dc_4bb3_989f_d1f84b318cf7.slice/crio-f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496 WatchSource:0}: Error finding container 
f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496: Status 404 returned error can't find the container with id f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496 Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.764634 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.764799 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.764914 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.765028 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.866986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.867069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.867130 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.867168 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.870898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.871200 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod 
\"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.872047 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.885144 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.905350 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerStarted","Data":"f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.024125 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.475801 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.854056 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" path="/var/lib/kubelet/pods/c8a1891c-fffd-4032-9384-bef764ca9f57/volumes" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.856552 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" path="/var/lib/kubelet/pods/ecd0e105-6bf4-436e-9d70-1b42f662e67f/volumes" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.916955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerStarted","Data":"8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.918025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerStarted","Data":"243c552ada916fbd8c98356e79d33549118c50125ea4dea29e072301c1c2979e"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.925165 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerStarted","Data":"ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.935553 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5cthc" podStartSLOduration=1.935534154 podStartE2EDuration="1.935534154s" podCreationTimestamp="2026-02-20 08:45:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:43.933379922 +0000 UTC m=+7158.806006643" watchObservedRunningTime="2026-02-20 08:45:43.935534154 +0000 UTC m=+7158.808160865" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.961323 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.961300804 podStartE2EDuration="2.961300804s" podCreationTimestamp="2026-02-20 08:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:43.954399598 +0000 UTC m=+7158.827026309" watchObservedRunningTime="2026-02-20 08:45:43.961300804 +0000 UTC m=+7158.833927515" Feb 20 08:45:44 crc kubenswrapper[5094]: I0220 08:45:44.302951 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:45:44 crc kubenswrapper[5094]: I0220 08:45:44.304801 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:45:47 crc kubenswrapper[5094]: I0220 08:45:47.273524 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 08:45:48 crc kubenswrapper[5094]: I0220 08:45:48.840833 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:45:48 crc kubenswrapper[5094]: E0220 08:45:48.841155 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 
08:45:48 crc kubenswrapper[5094]: I0220 08:45:48.984887 5094 generic.go:334] "Generic (PLEG): container finished" podID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerID="8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16" exitCode=0 Feb 20 08:45:48 crc kubenswrapper[5094]: I0220 08:45:48.984992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerDied","Data":"8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16"} Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.254050 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.254112 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.303024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.303329 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.337027 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.337161 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: 
I0220 08:45:50.417990 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.421193 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.421191 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525386 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525581 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525868 
5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.531590 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d" (OuterVolumeSpecName: "kube-api-access-q6n5d") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). InnerVolumeSpecName "kube-api-access-q6n5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.532482 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts" (OuterVolumeSpecName: "scripts") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.559839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.573824 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data" (OuterVolumeSpecName: "config-data") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628499 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628585 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628606 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628624 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.023131 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerDied","Data":"243c552ada916fbd8c98356e79d33549118c50125ea4dea29e072301c1c2979e"} Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.023188 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="243c552ada916fbd8c98356e79d33549118c50125ea4dea29e072301c1c2979e" Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.023225 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.205948 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.206303 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log" containerID="cri-o://a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.206357 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api" containerID="cri-o://33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.230331 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.230675 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler" containerID="cri-o://ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.301879 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.302409 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log" containerID="cri-o://e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.302586 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata" containerID="cri-o://5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341" gracePeriod=30 Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.045960 5094 generic.go:334] "Generic (PLEG): container finished" podID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd" exitCode=143 Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.046143 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerDied","Data":"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"} Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.061674 5094 generic.go:334] "Generic (PLEG): container finished" podID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf" exitCode=143 Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.061793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerDied","Data":"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"} Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.084740 5094 generic.go:334] "Generic (PLEG): container finished" podID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerID="ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28" exitCode=0 Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.084816 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerDied","Data":"ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28"} Feb 20 08:45:54 crc kubenswrapper[5094]: 
I0220 08:45:54.344979 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.399632 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.399896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.399935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.407627 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf" (OuterVolumeSpecName: "kube-api-access-bn7mf") pod "2e15e686-66dc-4bb3-989f-d1f84b318cf7" (UID: "2e15e686-66dc-4bb3-989f-d1f84b318cf7"). InnerVolumeSpecName "kube-api-access-bn7mf". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.427194 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e15e686-66dc-4bb3-989f-d1f84b318cf7" (UID: "2e15e686-66dc-4bb3-989f-d1f84b318cf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.428608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data" (OuterVolumeSpecName: "config-data") pod "2e15e686-66dc-4bb3-989f-d1f84b318cf7" (UID: "2e15e686-66dc-4bb3-989f-d1f84b318cf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.502096 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.502138 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.502149 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.871823 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.011539 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") "
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.011600 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") "
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.011806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") "
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.012191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs" (OuterVolumeSpecName: "logs") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.012383 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") "
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.013098 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.016397 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr" (OuterVolumeSpecName: "kube-api-access-zlhhr") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "kube-api-access-zlhhr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.038161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data" (OuterVolumeSpecName: "config-data") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.058045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095244 5094 generic.go:334] "Generic (PLEG): container finished" podID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341" exitCode=0
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095344 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerDied","Data":"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"}
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095376 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerDied","Data":"0e6070237df8767b14a2b477f51aaced221d1a725981607f33d88c8bcb05cbb9"}
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095399 5094 scope.go:117] "RemoveContainer" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095541 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.097827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerDied","Data":"f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496"}
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.097916 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.115650 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.115683 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.115694 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.129939 5094 scope.go:117] "RemoveContainer" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.154892 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.173100 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.181948 5094 scope.go:117] "RemoveContainer" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"
Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.184590 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341\": container with ID starting with 5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341 not found: ID does not exist" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.184620 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"} err="failed to get container status \"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341\": rpc error: code = NotFound desc = could not find container \"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341\": container with ID starting with 5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341 not found: ID does not exist"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.184639 5094 scope.go:117] "RemoveContainer" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"
Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.185818 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd\": container with ID starting with e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd not found: ID does not exist" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.185837 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"} err="failed to get container status \"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd\": rpc error: code = NotFound desc = could not find container \"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd\": container with ID starting with e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd not found: ID does not exist"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.185850 5094 scope.go:117] "RemoveContainer" containerID="ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.200991 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.212643 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221292 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221630 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerName="nova-manage"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221641 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerName="nova-manage"
Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221658 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata"
Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221674 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler"
Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221687 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221693 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221882 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerName="nova-manage"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221903 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221918 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221929 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.222859 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.224948 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.231651 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.244913 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.246426 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.248685 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.263562 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.317955 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318230 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318430 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318518 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420689 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420733 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420758 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420788 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.421160 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.426229 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.429185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.431367 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.431413 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.437039 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.441331 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.544114 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.566619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.867642 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" path="/var/lib/kubelet/pods/03adc315-89a1-44e2-b06f-f2279bd0805f/volumes"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.869067 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" path="/var/lib/kubelet/pods/2e15e686-66dc-4bb3-989f-d1f84b318cf7/volumes"
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.949289 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: W0220 08:45:56.017460 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bce5a1d_755b_4c0a_b9aa_1fce8cbfc8b8.slice/crio-1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85 WatchSource:0}: Error finding container 1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85: Status 404 returned error can't find the container with id 1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.018984 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") "
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031157 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") "
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") "
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031315 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") "
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.032396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs" (OuterVolumeSpecName: "logs") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.036641 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw" (OuterVolumeSpecName: "kube-api-access-mknbw") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "kube-api-access-mknbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.054596 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.061547 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data" (OuterVolumeSpecName: "config-data") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118291 5094 generic.go:334] "Generic (PLEG): container finished" podID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087" exitCode=0
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerDied","Data":"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"}
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118676 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerDied","Data":"f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37"}
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118696 5094 scope.go:117] "RemoveContainer" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118866 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.121849 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.122733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerStarted","Data":"1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85"}
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132777 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132796 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132806 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132818 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.270139 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.273580 5094 scope.go:117] "RemoveContainer" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.287010 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.299398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.301035 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481ed5ff_4180_4ff6_8d5f_b7876b484fb2.slice\": RecentStats: unable to find data in memory cache]"
Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.303284 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303308 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log"
Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.303334 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303342 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303547 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303561 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.304482 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.307037 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.318386 5094 scope.go:117] "RemoveContainer" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"
Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.319509 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087\": container with ID starting with 33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087 not found: ID does not exist" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.322454 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"} err="failed to get container status \"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087\": rpc error: code = NotFound desc = could not find container \"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087\": container with ID starting with 33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087 not found: ID does not exist"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.322499 5094 scope.go:117] "RemoveContainer" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"
Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.323514 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf\": container with ID starting with a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf not found: ID does not exist" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.323558 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"} err="failed to get container status \"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf\": rpc error: code = NotFound desc = could not find container \"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf\": container with ID starting with a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf not found: ID does not exist"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.323604 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.436792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.437061 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.437115 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.437319 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538713 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538745 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.539110 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.542853 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.543694 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.555457 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0"
Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.622540 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.041868 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.144119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerStarted","Data":"33ea9e800121c631e21658cce4961167b1aecb67f072d8888e0fe3827be668d0"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.146027 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerStarted","Data":"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.146070 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerStarted","Data":"0e61e77c80dfdbd6b142c3a986f85177ae881def4dc50810f9959c9b8afee96d"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.154577 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerStarted","Data":"ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.154629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerStarted","Data":"98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.175684 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.175666153 podStartE2EDuration="2.175666153s" podCreationTimestamp="2026-02-20 08:45:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:57.164253449 +0000 UTC m=+7172.036880230" watchObservedRunningTime="2026-02-20 08:45:57.175666153 +0000 UTC m=+7172.048292864" Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.183806 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.183788859 podStartE2EDuration="2.183788859s" podCreationTimestamp="2026-02-20 08:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:57.179647449 +0000 UTC m=+7172.052274170" watchObservedRunningTime="2026-02-20 08:45:57.183788859 +0000 UTC m=+7172.056415570" Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.850213 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" path="/var/lib/kubelet/pods/481ed5ff-4180-4ff6-8d5f-b7876b484fb2/volumes" Feb 20 08:45:58 crc kubenswrapper[5094]: I0220 08:45:58.164248 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerStarted","Data":"c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810"} Feb 20 08:45:58 crc kubenswrapper[5094]: I0220 08:45:58.164285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerStarted","Data":"ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f"} Feb 20 08:45:58 crc kubenswrapper[5094]: I0220 08:45:58.201726 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.201675581 podStartE2EDuration="2.201675581s" podCreationTimestamp="2026-02-20 08:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:58.185557784 +0000 UTC m=+7173.058184515" watchObservedRunningTime="2026-02-20 08:45:58.201675581 +0000 UTC m=+7173.074302332" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.545117 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.545171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.567779 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.840644 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:00 crc kubenswrapper[5094]: E0220 08:46:00.840912 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.545233 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.545773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.567785 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.611196 5094 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.287180 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.622867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.622925 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.627856 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.627870 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:07 crc kubenswrapper[5094]: I0220 08:46:07.705911 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:07 crc kubenswrapper[5094]: I0220 08:46:07.706200 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:12 crc kubenswrapper[5094]: I0220 08:46:12.840052 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:12 crc kubenswrapper[5094]: E0220 08:46:12.840629 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.546816 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.547102 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.548548 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.548954 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.627790 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.627874 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.628653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.628685 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.632096 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.632345 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.868291 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.873110 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.907109 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.015663 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.015952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.016049 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod 
\"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.016184 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.016255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117311 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117627 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117731 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.118517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.118622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.118906 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " 
pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.119063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.134419 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.210207 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.641894 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:46:18 crc kubenswrapper[5094]: I0220 08:46:18.371193 5094 generic.go:334] "Generic (PLEG): container finished" podID="4b88f14c-752a-4565-848b-8fb7820295db" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" exitCode=0 Feb 20 08:46:18 crc kubenswrapper[5094]: I0220 08:46:18.371278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerDied","Data":"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd"} Feb 20 08:46:18 crc kubenswrapper[5094]: I0220 08:46:18.371673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" 
event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerStarted","Data":"1c3f648da5272f28b64986994942baf398d516fa729e7f58160c062950c2a99e"} Feb 20 08:46:19 crc kubenswrapper[5094]: I0220 08:46:19.381025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerStarted","Data":"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8"} Feb 20 08:46:19 crc kubenswrapper[5094]: I0220 08:46:19.381372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:19 crc kubenswrapper[5094]: I0220 08:46:19.402470 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" podStartSLOduration=3.402450276 podStartE2EDuration="3.402450276s" podCreationTimestamp="2026-02-20 08:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:46:19.400673134 +0000 UTC m=+7194.273299845" watchObservedRunningTime="2026-02-20 08:46:19.402450276 +0000 UTC m=+7194.275077007" Feb 20 08:46:25 crc kubenswrapper[5094]: I0220 08:46:25.845719 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:25 crc kubenswrapper[5094]: E0220 08:46:25.846642 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.212154 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.295267 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.296209 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns" containerID="cri-o://362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420" gracePeriod=10 Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.457299 5094 generic.go:334] "Generic (PLEG): container finished" podID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerID="362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420" exitCode=0 Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.457341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerDied","Data":"362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420"} Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.778339 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913239 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913358 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913421 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913465 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.920916 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn" (OuterVolumeSpecName: "kube-api-access-jbwbn") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "kube-api-access-jbwbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.955199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config" (OuterVolumeSpecName: "config") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.961436 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.987380 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.989376 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016063 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016111 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016129 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016148 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016166 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.467319 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerDied","Data":"aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c"}
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.467380 5094 scope.go:117] "RemoveContainer" containerID="362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420"
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.467376 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.502485 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"]
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.504101 5094 scope.go:117] "RemoveContainer" containerID="c16a7fdf9fc7f05c88a3c26ca7db3574a5f1cb1c1c0097c7a0361cae6e703d9a"
Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.510873 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"]
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542140 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4ggkd"]
Feb 20 08:46:29 crc kubenswrapper[5094]: E0220 08:46:29.542498 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="init"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542510 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="init"
Feb 20 08:46:29 crc kubenswrapper[5094]: E0220 08:46:29.542533 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542539 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542724 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.543361 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.552016 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4ggkd"]
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.649049 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.649299 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.653899 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"]
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.655055 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.659821 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.668253 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"]
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751289 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751470 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751514 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.752929 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.779476 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.848202 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" path="/var/lib/kubelet/pods/67ea7a9d-b48a-4ea9-be81-50d152a57e58/volumes"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.852795 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.852909 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.853492 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.858337 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.868003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.980031 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.291177 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4ggkd"]
Feb 20 08:46:30 crc kubenswrapper[5094]: W0220 08:46:30.292599 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d9bbb80_f9cc_40ef_b3d1_c5a5cea72991.slice/crio-19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d WatchSource:0}: Error finding container 19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d: Status 404 returned error can't find the container with id 19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d
Feb 20 08:46:30 crc kubenswrapper[5094]: W0220 08:46:30.391398 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7475a056_ad82_42aa_85ee_4b5d6834434a.slice/crio-8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576 WatchSource:0}: Error finding container 8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576: Status 404 returned error can't find the container with id 8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576
Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.391748 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"]
Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.483785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerStarted","Data":"304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c"}
Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.483869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerStarted","Data":"19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d"}
Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.485185 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dc9b-account-create-update-6hd8k" event={"ID":"7475a056-ad82-42aa-85ee-4b5d6834434a","Type":"ContainerStarted","Data":"8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576"}
Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.499823 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-4ggkd" podStartSLOduration=1.499780718 podStartE2EDuration="1.499780718s" podCreationTimestamp="2026-02-20 08:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:46:30.49614508 +0000 UTC m=+7205.368771791" watchObservedRunningTime="2026-02-20 08:46:30.499780718 +0000 UTC m=+7205.372407449"
Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.495489 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerID="304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c" exitCode=0
Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.495578 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerDied","Data":"304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c"}
Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.500774 5094 generic.go:334] "Generic (PLEG): container finished" podID="7475a056-ad82-42aa-85ee-4b5d6834434a" containerID="e06fc5fd3620d2019a01f12c26721ba58935bf528cffce9cee66802b4ab5054a" exitCode=0
Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.500819 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dc9b-account-create-update-6hd8k" event={"ID":"7475a056-ad82-42aa-85ee-4b5d6834434a","Type":"ContainerDied","Data":"e06fc5fd3620d2019a01f12c26721ba58935bf528cffce9cee66802b4ab5054a"}
Feb 20 08:46:32 crc kubenswrapper[5094]: I0220 08:46:32.877639 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:32 crc kubenswrapper[5094]: I0220 08:46:32.976891 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.016542 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") "
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.016671 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") "
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.017425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" (UID: "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.022656 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7" (OuterVolumeSpecName: "kube-api-access-bv6j7") pod "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" (UID: "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991"). InnerVolumeSpecName "kube-api-access-bv6j7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.118806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"7475a056-ad82-42aa-85ee-4b5d6834434a\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") "
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.118891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"7475a056-ad82-42aa-85ee-4b5d6834434a\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") "
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.119197 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.119212 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.119492 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7475a056-ad82-42aa-85ee-4b5d6834434a" (UID: "7475a056-ad82-42aa-85ee-4b5d6834434a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.121344 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8" (OuterVolumeSpecName: "kube-api-access-wjnr8") pod "7475a056-ad82-42aa-85ee-4b5d6834434a" (UID: "7475a056-ad82-42aa-85ee-4b5d6834434a"). InnerVolumeSpecName "kube-api-access-wjnr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.220768 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.220841 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.524004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerDied","Data":"19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d"}
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.524042 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d"
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.524103 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4ggkd"
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.527387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dc9b-account-create-update-6hd8k" event={"ID":"7475a056-ad82-42aa-85ee-4b5d6834434a","Type":"ContainerDied","Data":"8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576"}
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.527431 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576"
Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.527499 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.893433 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-n78qt"]
Feb 20 08:46:34 crc kubenswrapper[5094]: E0220 08:46:34.894002 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerName="mariadb-database-create"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894025 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerName="mariadb-database-create"
Feb 20 08:46:34 crc kubenswrapper[5094]: E0220 08:46:34.894045 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" containerName="mariadb-account-create-update"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894057 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" containerName="mariadb-account-create-update"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894370 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" containerName="mariadb-account-create-update"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894402 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerName="mariadb-database-create"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.895226 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.903329 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.903517 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.904800 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xwpqq"
Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.906467 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n78qt"]
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053187 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053231 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053310 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053365 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053384 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155113 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155206 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155229 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155253 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155302 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155371 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155417 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.161195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.161195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.161302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.162317 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.173266 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.217225 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.663841 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n78qt"]
Feb 20 08:46:35 crc kubenswrapper[5094]: W0220 08:46:35.667139 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cbb5a80_aef6_405d_bf92_d0d9cc872c78.slice/crio-4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d WatchSource:0}: Error finding container 4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d: Status 404 returned error can't find the container with id 4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d
Feb 20 08:46:36 crc kubenswrapper[5094]: I0220 08:46:36.572695 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerStarted","Data":"4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d"}
Feb 20 08:46:39 crc kubenswrapper[5094]: I0220 08:46:39.840174 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"
Feb 20 08:46:39 crc kubenswrapper[5094]: E0220 08:46:39.840733 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:46:51 crc kubenswrapper[5094]: I0220 08:46:51.844168 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"
Feb 20 08:46:51 crc kubenswrapper[5094]: E0220 08:46:51.844852 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:46:54 crc kubenswrapper[5094]: I0220 08:46:54.738352 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerStarted","Data":"08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd"}
Feb 20 08:46:54 crc kubenswrapper[5094]: I0220 08:46:54.766832 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-n78qt" podStartSLOduration=2.7379194460000003 podStartE2EDuration="20.766817205s" podCreationTimestamp="2026-02-20 08:46:34 +0000 UTC" firstStartedPulling="2026-02-20 08:46:35.669614732 +0000 UTC m=+7210.542241443" lastFinishedPulling="2026-02-20 08:46:53.698512491 +0000 UTC m=+7228.571139202" observedRunningTime="2026-02-20 08:46:54.759017767 +0000 UTC m=+7229.631644528" watchObservedRunningTime="2026-02-20 08:46:54.766817205 +0000 UTC m=+7229.639443916"
Feb 20 08:46:56 crc kubenswrapper[5094]: I0220 08:46:56.753983 5094 generic.go:334] "Generic (PLEG): container finished" podID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerID="08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd" exitCode=0
Feb 20 08:46:56 crc kubenswrapper[5094]: I0220 08:46:56.754268 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerDied","Data":"08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd"}
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.067534 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt"
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227771 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") "
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227844 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") "
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227871 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") "
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") "
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228021 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") "
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228043 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") "
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228305 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228483 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.232954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.233022 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts" (OuterVolumeSpecName: "scripts") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.233345 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl" (OuterVolumeSpecName: "kube-api-access-wlwfl") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "kube-api-access-wlwfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.250855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.286911 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data" (OuterVolumeSpecName: "config-data") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.330989 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331034 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331048 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331059 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331072 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") on node \"crc\" DevicePath \"\""
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.774547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerDied","Data":"4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d"}
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.774594 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d"
Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.774658 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.376747 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:46:59 crc kubenswrapper[5094]: E0220 08:46:59.377218 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerName="cinder-db-sync" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.377235 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerName="cinder-db-sync" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.377459 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerName="cinder-db-sync" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.378583 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.409001 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550169 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550231 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " 
pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550293 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.552400 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.553903 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.556923 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.557598 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.557841 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.557960 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xwpqq" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.564937 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.651909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652249 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc 
kubenswrapper[5094]: I0220 08:46:59.652268 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652322 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652355 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652400 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652439 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652458 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.653468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.654028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.654388 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.654519 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.671027 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.697771 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760155 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760212 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760240 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760280 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760404 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.762489 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.765700 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.765809 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.767611 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.774637 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.777200 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.783728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.869163 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.219441 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:47:00 crc kubenswrapper[5094]: W0220 08:47:00.392698 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7799fdb_4c3c_4792_be6d_f988852a6dad.slice/crio-0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad WatchSource:0}: Error finding container 0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad: Status 404 returned error can't find the container with id 0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.396868 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.818751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerStarted","Data":"0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad"} Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.820585 5094 generic.go:334] "Generic (PLEG): container finished" podID="4909c4ac-65fa-412c-990d-974868b0f104" containerID="f5767cd62a5a9e26fc88ffbe25eb74c9c4932ee6d1de8eb39356b77614dedec0" exitCode=0 Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.820616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerDied","Data":"f5767cd62a5a9e26fc88ffbe25eb74c9c4932ee6d1de8eb39356b77614dedec0"} Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.820631 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" 
event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerStarted","Data":"8084f28745ebf13a7935e0af610ee153d1789476c3995420facfd289029eaab4"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.830799 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerStarted","Data":"08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.831399 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.834555 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerStarted","Data":"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.834607 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerStarted","Data":"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.834737 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.876342 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" podStartSLOduration=2.876318352 podStartE2EDuration="2.876318352s" podCreationTimestamp="2026-02-20 08:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:01.854278282 +0000 UTC m=+7236.726905003" watchObservedRunningTime="2026-02-20 08:47:01.876318352 +0000 UTC m=+7236.748945073" Feb 20 08:47:01 crc 
kubenswrapper[5094]: I0220 08:47:01.886355 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.886333153 podStartE2EDuration="2.886333153s" podCreationTimestamp="2026-02-20 08:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:01.871598008 +0000 UTC m=+7236.744224719" watchObservedRunningTime="2026-02-20 08:47:01.886333153 +0000 UTC m=+7236.758959854" Feb 20 08:47:03 crc kubenswrapper[5094]: I0220 08:47:03.839879 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:03 crc kubenswrapper[5094]: E0220 08:47:03.840386 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:09 crc kubenswrapper[5094]: I0220 08:47:09.700041 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:47:09 crc kubenswrapper[5094]: I0220 08:47:09.803119 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:47:09 crc kubenswrapper[5094]: I0220 08:47:09.803468 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" containerID="cri-o://d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" gracePeriod=10 Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.338873 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.368264 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369397 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369434 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369484 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.374868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7" (OuterVolumeSpecName: "kube-api-access-lp8v7") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "kube-api-access-lp8v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.420971 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config" (OuterVolumeSpecName: "config") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.442469 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.456198 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.459361 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471544 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471571 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471582 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471595 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471603 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536635 5094 generic.go:334] "Generic (PLEG): container finished" podID="4b88f14c-752a-4565-848b-8fb7820295db" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" exitCode=0 Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerDied","Data":"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8"} Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 
08:47:10.536699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerDied","Data":"1c3f648da5272f28b64986994942baf398d516fa729e7f58160c062950c2a99e"} Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536730 5094 scope.go:117] "RemoveContainer" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536837 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.570072 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.573345 5094 scope.go:117] "RemoveContainer" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.583301 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.617694 5094 scope.go:117] "RemoveContainer" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" Feb 20 08:47:10 crc kubenswrapper[5094]: E0220 08:47:10.618172 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8\": container with ID starting with d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8 not found: ID does not exist" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.618223 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8"} err="failed to get container status \"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8\": rpc error: code = NotFound desc = could not find container \"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8\": container with ID starting with d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8 not found: ID does not exist" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.618257 5094 scope.go:117] "RemoveContainer" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" Feb 20 08:47:10 crc kubenswrapper[5094]: E0220 08:47:10.619076 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd\": container with ID starting with 1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd not found: ID does not exist" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.619114 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd"} err="failed to get container status \"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd\": rpc error: code = NotFound desc = could not find container \"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd\": container with ID starting with 1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd not found: ID does not exist" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.032004 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.032282 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" containerName="nova-scheduler-scheduler" containerID="cri-o://a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.049191 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.049472 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" containerID="cri-o://98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.049569 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" containerID="cri-o://ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.062492 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.062755 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" containerID="cri-o://ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.062793 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" containerID="cri-o://c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.076160 5094 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.076392 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.087403 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.087636 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" containerID="cri-o://43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.122980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.123170 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" containerID="cri-o://e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.548515 5094 generic.go:334] "Generic (PLEG): container finished" podID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerID="df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422" exitCode=0 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.548564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerDied","Data":"df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422"} Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.551360 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerID="ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f" exitCode=143 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.551391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerDied","Data":"ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f"} Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.553040 5094 generic.go:334] "Generic (PLEG): container finished" podID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerID="98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49" exitCode=143 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.553103 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerDied","Data":"98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49"} Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.772194 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.791198 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"1b2421e1-8243-473f-8dd5-86bc130d251f\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.791419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"1b2421e1-8243-473f-8dd5-86bc130d251f\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.791480 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"1b2421e1-8243-473f-8dd5-86bc130d251f\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.796750 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd" (OuterVolumeSpecName: "kube-api-access-kh7fd") pod "1b2421e1-8243-473f-8dd5-86bc130d251f" (UID: "1b2421e1-8243-473f-8dd5-86bc130d251f"). InnerVolumeSpecName "kube-api-access-kh7fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.818257 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data" (OuterVolumeSpecName: "config-data") pod "1b2421e1-8243-473f-8dd5-86bc130d251f" (UID: "1b2421e1-8243-473f-8dd5-86bc130d251f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.832213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b2421e1-8243-473f-8dd5-86bc130d251f" (UID: "1b2421e1-8243-473f-8dd5-86bc130d251f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.862331 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b88f14c-752a-4565-848b-8fb7820295db" path="/var/lib/kubelet/pods/4b88f14c-752a-4565-848b-8fb7820295db/volumes" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.893759 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.893798 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.893812 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.010829 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.231558 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.232685 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.236411 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.236451 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.565105 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerDied","Data":"3235e0f9a591cd9fc84d38dd78cd295e9068409f0e216b0a40d76217b8e522fa"} Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.565357 5094 scope.go:117] "RemoveContainer" containerID="df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.565465 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.573640 5094 generic.go:334] "Generic (PLEG): container finished" podID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerID="e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58" exitCode=0 Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.573874 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerDied","Data":"e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58"} Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.610667 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.627692 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.656900 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.657621 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657638 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.657657 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657665 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.657677 5094 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="init" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657682 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="init" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657963 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657977 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.658630 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.661669 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.668515 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.708836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.708886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hschv\" (UniqueName: \"kubernetes.io/projected/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-kube-api-access-hschv\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.708961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.810839 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.811069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hschv\" (UniqueName: \"kubernetes.io/projected/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-kube-api-access-hschv\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.811672 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.816469 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 
crc kubenswrapper[5094]: I0220 08:47:12.820069 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.835242 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hschv\" (UniqueName: \"kubernetes.io/projected/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-kube-api-access-hschv\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.917186 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.977714 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.015745 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.016221 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.016482 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.023117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4" (OuterVolumeSpecName: "kube-api-access-c49n4") pod "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" (UID: "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca"). InnerVolumeSpecName "kube-api-access-c49n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.044199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" (UID: "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.057344 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data" (OuterVolumeSpecName: "config-data") pod "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" (UID: "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.119223 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.119253 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.119266 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.229603 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.586116 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.586109 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerDied","Data":"ed4718ea392d6b6de8bbaf21e72aac2f90b8c262b455dfd4b484e4049f29e229"} Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.586619 5094 scope.go:117] "RemoveContainer" containerID="e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.590018 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb","Type":"ContainerStarted","Data":"c0b59c2a463b548710f6fbe279ff8a8274457e652db869cc9b9e3369bcd64626"} Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.590065 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb","Type":"ContainerStarted","Data":"b0244251df9f7a574b074e23929a9ed957444491d4ff2d0dcfbdab8da31068a8"} Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.613954 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.613931173 podStartE2EDuration="1.613931173s" podCreationTimestamp="2026-02-20 08:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:13.613101613 +0000 UTC m=+7248.485728314" watchObservedRunningTime="2026-02-20 08:47:13.613931173 +0000 UTC m=+7248.486557894" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.651508 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.661803 5094 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.674635 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: E0220 08:47:13.675112 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.675132 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.675356 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.676262 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.681348 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.684503 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.731564 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.731767 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.731944 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.834848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.835016 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.835138 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.840314 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod 
\"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.841236 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.853239 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" path="/var/lib/kubelet/pods/1b2421e1-8243-473f-8dd5-86bc130d251f/volumes" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.853804 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" path="/var/lib/kubelet/pods/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca/volumes" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.854795 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.998275 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.460973 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.552357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"008eca6a-12d6-40dd-96bb-391428bd27c5\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.552553 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"008eca6a-12d6-40dd-96bb-391428bd27c5\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.552658 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"008eca6a-12d6-40dd-96bb-391428bd27c5\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.559845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9" (OuterVolumeSpecName: "kube-api-access-svnb9") pod "008eca6a-12d6-40dd-96bb-391428bd27c5" (UID: "008eca6a-12d6-40dd-96bb-391428bd27c5"). InnerVolumeSpecName "kube-api-access-svnb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.583425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data" (OuterVolumeSpecName: "config-data") pod "008eca6a-12d6-40dd-96bb-391428bd27c5" (UID: "008eca6a-12d6-40dd-96bb-391428bd27c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.586803 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "008eca6a-12d6-40dd-96bb-391428bd27c5" (UID: "008eca6a-12d6-40dd-96bb-391428bd27c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.604168 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.609787 5094 generic.go:334] "Generic (PLEG): container finished" podID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerID="ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d" exitCode=0 Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.609956 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerDied","Data":"ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.649150 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerID="c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810" exitCode=0 Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.649235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerDied","Data":"c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.651504 5094 generic.go:334] "Generic (PLEG): container finished" podID="008eca6a-12d6-40dd-96bb-391428bd27c5" 
containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" exitCode=0 Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.651864 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.652798 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerDied","Data":"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.652838 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerDied","Data":"0e61e77c80dfdbd6b142c3a986f85177ae881def4dc50810f9959c9b8afee96d"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.652865 5094 scope.go:117] "RemoveContainer" containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.654986 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.655031 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.655044 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.714307 5094 scope.go:117] "RemoveContainer" 
containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" Feb 20 08:47:14 crc kubenswrapper[5094]: E0220 08:47:14.715358 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824\": container with ID starting with a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824 not found: ID does not exist" containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.715523 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824"} err="failed to get container status \"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824\": rpc error: code = NotFound desc = could not find container \"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824\": container with ID starting with a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824 not found: ID does not exist" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.731917 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.746237 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.753832 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: E0220 08:47:14.754359 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" containerName="nova-scheduler-scheduler" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.754386 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" 
containerName="nova-scheduler-scheduler" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.754748 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" containerName="nova-scheduler-scheduler" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.756873 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.761540 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.769787 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.861765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.873191 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.965773 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.965974 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966107 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966619 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966685 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966771 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.969000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs" (OuterVolumeSpecName: "logs") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.975189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw" (OuterVolumeSpecName: "kube-api-access-ttjnw") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "kube-api-access-ttjnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.009654 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data" (OuterVolumeSpecName: "config-data") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.024088 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069338 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069434 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069573 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069886 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069962 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.070084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071421 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071732 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071756 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071772 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.075556 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs" (OuterVolumeSpecName: "logs") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.078589 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.081312 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv" (OuterVolumeSpecName: "kube-api-access-8d7zv") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "kube-api-access-8d7zv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.082611 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.088737 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.095982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.116934 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data" (OuterVolumeSpecName: "config-data") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.140456 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174325 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174365 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174378 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174390 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: W0220 08:47:15.642047 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d2f807_a13f_4a1d_93d3_293d1afd6e4c.slice/crio-364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8 WatchSource:0}: Error finding container 364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8: Status 404 returned error can't find the container with id 364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8 Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.657990 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.677577 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerDied","Data":"1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.677640 5094 scope.go:117] "RemoveContainer" containerID="ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.677823 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.698781 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.700245 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerDied","Data":"33ea9e800121c631e21658cce4961167b1aecb67f072d8888e0fe3827be668d0"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.711167 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerStarted","Data":"fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.711210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerStarted","Data":"b27e92493a93d647968058e4cfe443d6348c867ece948ce72e75c01521bbc434"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.711538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.736371 5094 scope.go:117] "RemoveContainer" containerID="98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 
08:47:15.739321 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerStarted","Data":"364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.754920 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.754891847 podStartE2EDuration="2.754891847s" podCreationTimestamp="2026-02-20 08:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:15.735953871 +0000 UTC m=+7250.608580582" watchObservedRunningTime="2026-02-20 08:47:15.754891847 +0000 UTC m=+7250.627518568" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.789919 5094 scope.go:117] "RemoveContainer" containerID="c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.794920 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.821475 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.825346 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.838872 5094 scope.go:117] "RemoveContainer" containerID="ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.839752 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.839996 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.879982 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" path="/var/lib/kubelet/pods/008eca6a-12d6-40dd-96bb-391428bd27c5/volumes" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.881108 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" path="/var/lib/kubelet/pods/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd/volumes" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.881843 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882219 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882236 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882260 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882267 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882281 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" Feb 20 08:47:15 crc 
kubenswrapper[5094]: I0220 08:47:15.882287 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882316 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882322 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882510 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882529 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882540 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882550 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.884217 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.884240 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.884346 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.887894 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.920804 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.922917 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.936951 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.939772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc 
kubenswrapper[5094]: I0220 08:47:16.012888 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012969 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.013023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115222 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115357 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115431 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " 
pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115510 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.117295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.119321 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.123341 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.124003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"nova-metadata-0\" 
(UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.135678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.142466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.150155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.169770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.238379 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.274605 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.782159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerStarted","Data":"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f"} Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.792589 5094 generic.go:334] "Generic (PLEG): container finished" podID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" exitCode=0 Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.792681 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerDied","Data":"43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d"} Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.903370 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.921549 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.921528777 podStartE2EDuration="2.921528777s" podCreationTimestamp="2026-02-20 08:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:16.816853149 +0000 UTC m=+7251.689479860" watchObservedRunningTime="2026-02-20 08:47:16.921528777 +0000 UTC m=+7251.794155488" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.931177 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.038947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"4bd3b441-92b9-4fd4-8451-dec1c354915e\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.041175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"4bd3b441-92b9-4fd4-8451-dec1c354915e\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.041396 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"4bd3b441-92b9-4fd4-8451-dec1c354915e\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.061338 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc" (OuterVolumeSpecName: "kube-api-access-7mcnc") pod "4bd3b441-92b9-4fd4-8451-dec1c354915e" (UID: "4bd3b441-92b9-4fd4-8451-dec1c354915e"). InnerVolumeSpecName "kube-api-access-7mcnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.070665 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.080172 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data" (OuterVolumeSpecName: "config-data") pod "4bd3b441-92b9-4fd4-8451-dec1c354915e" (UID: "4bd3b441-92b9-4fd4-8451-dec1c354915e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.089574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bd3b441-92b9-4fd4-8451-dec1c354915e" (UID: "4bd3b441-92b9-4fd4-8451-dec1c354915e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.144627 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.144683 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.144698 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.822825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerStarted","Data":"c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.823139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerStarted","Data":"39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.823149 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerStarted","Data":"867e153f6129e6c09f4b4a68b08d0ac6938b5f39543d1e08a62fd3fdae93737c"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.825372 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerStarted","Data":"25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.825394 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerStarted","Data":"c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.825402 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerStarted","Data":"d585671e2fde2c389818c568ec8f701d1f0c341b00acbfaa339458d079916a62"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.827612 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.835226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerDied","Data":"05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.835334 5094 scope.go:117] "RemoveContainer" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.850222 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" path="/var/lib/kubelet/pods/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8/volumes" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.872039 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.872016138 podStartE2EDuration="2.872016138s" podCreationTimestamp="2026-02-20 08:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:17.864821055 +0000 UTC m=+7252.737447766" watchObservedRunningTime="2026-02-20 08:47:17.872016138 +0000 UTC m=+7252.744642849" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.894100 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.894070819 podStartE2EDuration="2.894070819s" podCreationTimestamp="2026-02-20 08:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:17.890621696 +0000 UTC m=+7252.763248427" watchObservedRunningTime="2026-02-20 08:47:17.894070819 +0000 UTC m=+7252.766697540" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.925215 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.943691 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.954910 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: E0220 08:47:17.955338 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.955356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.955552 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.956332 5094 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.959499 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.965883 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.978247 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.063023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.063071 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.063146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.166267 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2s8\" (UniqueName: 
\"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.166751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.166966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.175854 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.185327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.201223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.286548 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: W0220 08:47:18.796940 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec04fa38_0d41_4c78_99fd_56299cd1c5ac.slice/crio-205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899 WatchSource:0}: Error finding container 205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899: Status 404 returned error can't find the container with id 205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899 Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.800692 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.841636 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerStarted","Data":"205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899"} Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.855855 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" path="/var/lib/kubelet/pods/4bd3b441-92b9-4fd4-8451-dec1c354915e/volumes" Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.857051 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerStarted","Data":"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"} Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.857115 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" 
Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.905579 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.905550778 podStartE2EDuration="2.905550778s" podCreationTimestamp="2026-02-20 08:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:19.875582138 +0000 UTC m=+7254.748208849" watchObservedRunningTime="2026-02-20 08:47:19.905550778 +0000 UTC m=+7254.778177489" Feb 20 08:47:20 crc kubenswrapper[5094]: I0220 08:47:20.141888 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 08:47:21 crc kubenswrapper[5094]: I0220 08:47:21.275489 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:47:21 crc kubenswrapper[5094]: I0220 08:47:21.275559 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:47:22 crc kubenswrapper[5094]: I0220 08:47:22.979142 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:23 crc kubenswrapper[5094]: I0220 08:47:23.002539 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:23 crc kubenswrapper[5094]: I0220 08:47:23.901162 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:24 crc kubenswrapper[5094]: I0220 08:47:24.035381 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:25 crc kubenswrapper[5094]: I0220 08:47:25.141556 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 08:47:25 crc kubenswrapper[5094]: I0220 
08:47:25.178344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 08:47:25 crc kubenswrapper[5094]: I0220 08:47:25.939234 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.238992 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.239788 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.275814 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.275866 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403079 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403100 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403120 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403360 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.840584 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:27 crc kubenswrapper[5094]: E0220 08:47:27.840831 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:28 crc kubenswrapper[5094]: I0220 08:47:28.325354 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.507719 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.509678 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.511779 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.533496 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631518 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631560 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631604 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631828 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 
08:47:34.631919 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631974 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733520 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733620 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.734496 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.738895 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.739614 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.740014 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.742545 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.757786 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0"
Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.828979 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 20 08:47:35 crc kubenswrapper[5094]: I0220 08:47:35.301349 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 20 08:47:35 crc kubenswrapper[5094]: I0220 08:47:35.996005 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerStarted","Data":"96e6715e90a0b7ca9a4f361deda20ad4a5f2e1b7bb2403e2af62f561294f2544"}
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.248946 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.250975 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.260073 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.277202 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.286998 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.288089 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.296106 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.385270 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.386019 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" containerID="cri-o://ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" gracePeriod=30
Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.386118 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" containerID="cri-o://c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" gracePeriod=30
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.006814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerStarted","Data":"6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123"}
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.007139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerStarted","Data":"559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2"}
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.010366 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" exitCode=143
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.010452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerDied","Data":"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a"}
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.010825 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.012900 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.022282 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.024322 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.024665 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.026136 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.037909 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.045364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.7583919789999998 podStartE2EDuration="3.045343801s" podCreationTimestamp="2026-02-20 08:47:34 +0000 UTC" firstStartedPulling="2026-02-20 08:47:35.302395319 +0000 UTC m=+7270.175022030" lastFinishedPulling="2026-02-20 08:47:35.589347141 +0000 UTC m=+7270.461973852" observedRunningTime="2026-02-20 08:47:37.038178269 +0000 UTC m=+7271.910804980" watchObservedRunningTime="2026-02-20 08:47:37.045343801 +0000 UTC m=+7271.917970512"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085309 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-run\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085365 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-dev\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085400 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085486 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085595 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmjg\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-kube-api-access-7rmjg\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085687 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085796 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085994 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.086021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-sys\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187567 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187615 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187637 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187660 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-sys\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-run\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187852 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-dev\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187869 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187938 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187959 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187975 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmjg\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-kube-api-access-7rmjg\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188669 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188940 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189030 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189077 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-sys\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189112 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-run\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189146 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-dev\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188667 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.194446 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.195604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.208346 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmjg\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-kube-api-access-7rmjg\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.208736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.208754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.223638 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.342792 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.695977 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.697940 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.705224 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.721123 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799883 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799942 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-sys\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799970 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799999 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800042 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-ceph\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-dev\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800075 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800127 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-run\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800181 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-scripts\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-lib-modules\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800589 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccnws\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-kube-api-access-ccnws\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.867115 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902550 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902602 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-lib-modules\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902660 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccnws\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-kube-api-access-ccnws\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902763 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-sys\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-lib-modules\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903451 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903457 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-sys\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903783 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903854 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903893 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-ceph\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903918 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-dev\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903942 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-run\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903993 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.904077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0"
Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.904124 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-scripts\") pod \"cinder-backup-0\" (UID:
\"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907528 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-run\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-dev\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907635 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907665 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907984 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.908533 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-ceph\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.910335 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.910411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.910973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.912465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-scripts\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.920809 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccnws\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-kube-api-access-ccnws\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:38 crc kubenswrapper[5094]: I0220 08:47:38.021634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"571a6098-6e30-438f-a6a9-fb751a79ca27","Type":"ContainerStarted","Data":"aeb59c1588eed2cf17b2842288e5805871d6e404b09670ae490547f5c37d6bbf"} Feb 20 08:47:38 crc kubenswrapper[5094]: I0220 08:47:38.039009 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 20 08:47:38 crc kubenswrapper[5094]: I0220 08:47:38.576348 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.034910 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"571a6098-6e30-438f-a6a9-fb751a79ca27","Type":"ContainerStarted","Data":"263654ba6b99106d51044c8bcfb147076ab69754ce97751c1bb0923e1c8d7582"} Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.035379 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"571a6098-6e30-438f-a6a9-fb751a79ca27","Type":"ContainerStarted","Data":"5af98cbef9ac7a3e0e5bae1f39cad5d6cabea6e45418f4b771bfc377fb91dd98"} Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.040317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d7f13f97-3504-4faa-a8cf-8ad4a7973623","Type":"ContainerStarted","Data":"c189d74fd0a1b93bdf0acee366dbe356f8f5f161057f579d8366a8f622988683"} Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.061937 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.629014001 podStartE2EDuration="3.061912173s" podCreationTimestamp="2026-02-20 08:47:36 +0000 UTC" firstStartedPulling="2026-02-20 08:47:37.882441435 +0000 UTC m=+7272.755068146" lastFinishedPulling="2026-02-20 08:47:38.315339607 +0000 UTC m=+7273.187966318" observedRunningTime="2026-02-20 08:47:39.05430692 +0000 UTC m=+7273.926933651" watchObservedRunningTime="2026-02-20 08:47:39.061912173 +0000 UTC m=+7273.934538884" Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.829913 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.003333 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.051143 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d7f13f97-3504-4faa-a8cf-8ad4a7973623","Type":"ContainerStarted","Data":"e8645feb6916fb733f295972fe628c533a4264269a03a3fa0d4febf7afa90ed8"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.051187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d7f13f97-3504-4faa-a8cf-8ad4a7973623","Type":"ContainerStarted","Data":"b150727e7d44034a400e262ea8fb30b0e4ebceb9cca7b3a084a18f2176c6797f"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.054206 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" exitCode=0 Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.054831 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.055000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerDied","Data":"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.055026 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerDied","Data":"0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.055042 5094 scope.go:117] "RemoveContainer" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.095957 5094 scope.go:117] "RemoveContainer" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.099944 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.781029 podStartE2EDuration="3.099924409s" podCreationTimestamp="2026-02-20 08:47:37 +0000 UTC" firstStartedPulling="2026-02-20 08:47:38.588165299 +0000 UTC m=+7273.460792010" lastFinishedPulling="2026-02-20 08:47:38.907060708 +0000 UTC m=+7273.779687419" observedRunningTime="2026-02-20 08:47:40.080148753 +0000 UTC m=+7274.952775464" watchObservedRunningTime="2026-02-20 08:47:40.099924409 +0000 UTC m=+7274.972551120" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.120987 5094 scope.go:117] "RemoveContainer" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.126345 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513\": container with ID starting with c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513 not found: ID does not exist" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.126976 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513"} err="failed to get container status \"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513\": rpc error: code = NotFound desc = could not find container \"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513\": container with ID starting with c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513 not found: ID does not exist" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.127002 5094 scope.go:117] "RemoveContainer" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.127492 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a\": container with ID starting with ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a not found: ID does not exist" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.127533 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a"} err="failed to get container status \"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a\": rpc error: code = NotFound desc = could not find container \"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a\": container with ID 
starting with ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a not found: ID does not exist" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.145243 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.145354 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.145374 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146434 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146462 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146509 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146571 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs" (OuterVolumeSpecName: "logs") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.147629 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.151317 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts" (OuterVolumeSpecName: "scripts") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.153074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4" (OuterVolumeSpecName: "kube-api-access-xdwt4") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "kube-api-access-xdwt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.157627 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.195472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.207095 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data" (OuterVolumeSpecName: "config-data") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249485 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249646 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249676 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249752 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249778 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249814 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249827 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.388352 5094 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.401247 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.415817 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.416299 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.416326 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.416438 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.416528 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.417318 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.417378 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.418603 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.422548 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454323 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454420 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-scripts\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454539 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454624 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8551a6-6aac-4c12-b3ce-913397a5316f-logs\") pod \"cinder-api-0\" (UID: 
\"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454651 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5mc\" (UniqueName: \"kubernetes.io/projected/3b8551a6-6aac-4c12-b3ce-913397a5316f-kube-api-access-4n5mc\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b8551a6-6aac-4c12-b3ce-913397a5316f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.455046 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556062 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556138 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8551a6-6aac-4c12-b3ce-913397a5316f-logs\") pod \"cinder-api-0\" (UID: 
\"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556156 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5mc\" (UniqueName: \"kubernetes.io/projected/3b8551a6-6aac-4c12-b3ce-913397a5316f-kube-api-access-4n5mc\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556177 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b8551a6-6aac-4c12-b3ce-913397a5316f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556208 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556251 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-scripts\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.557015 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b8551a6-6aac-4c12-b3ce-913397a5316f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.557596 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3b8551a6-6aac-4c12-b3ce-913397a5316f-logs\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.561322 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.561468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.562142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.566424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-scripts\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.576608 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5mc\" (UniqueName: \"kubernetes.io/projected/3b8551a6-6aac-4c12-b3ce-913397a5316f-kube-api-access-4n5mc\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.751491 5094 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:41 crc kubenswrapper[5094]: W0220 08:47:41.181101 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b8551a6_6aac_4c12_b3ce_913397a5316f.slice/crio-a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1 WatchSource:0}: Error finding container a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1: Status 404 returned error can't find the container with id a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1 Feb 20 08:47:41 crc kubenswrapper[5094]: I0220 08:47:41.192765 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:41 crc kubenswrapper[5094]: I0220 08:47:41.841293 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:41 crc kubenswrapper[5094]: E0220 08:47:41.841857 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:41 crc kubenswrapper[5094]: I0220 08:47:41.861496 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" path="/var/lib/kubelet/pods/a7799fdb-4c3c-4792-be6d-f988852a6dad/volumes" Feb 20 08:47:42 crc kubenswrapper[5094]: I0220 08:47:42.074999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"3b8551a6-6aac-4c12-b3ce-913397a5316f","Type":"ContainerStarted","Data":"848984c5cec08fc4ac1f4f8a3a4487b094f0af4ff916255c85d282443bbbf29a"} Feb 20 08:47:42 crc kubenswrapper[5094]: I0220 08:47:42.075056 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b8551a6-6aac-4c12-b3ce-913397a5316f","Type":"ContainerStarted","Data":"a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1"} Feb 20 08:47:42 crc kubenswrapper[5094]: I0220 08:47:42.343101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.040196 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.084272 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b8551a6-6aac-4c12-b3ce-913397a5316f","Type":"ContainerStarted","Data":"3f045c8da32a76424886c896e6d99cc15a91c8738009ed815181fed0d46683e4"} Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.084453 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.106852 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.106830381 podStartE2EDuration="3.106830381s" podCreationTimestamp="2026-02-20 08:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:43.105147991 +0000 UTC m=+7277.977774702" watchObservedRunningTime="2026-02-20 08:47:43.106830381 +0000 UTC m=+7277.979457102" Feb 20 08:47:44 crc kubenswrapper[5094]: I0220 08:47:44.870290 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" 
podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.82:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.018278 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.073053 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.105848 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" containerID="cri-o://559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2" gracePeriod=30 Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.105955 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" containerID="cri-o://6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123" gracePeriod=30 Feb 20 08:47:46 crc kubenswrapper[5094]: I0220 08:47:46.120669 5094 generic.go:334] "Generic (PLEG): container finished" podID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerID="6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123" exitCode=0 Feb 20 08:47:46 crc kubenswrapper[5094]: I0220 08:47:46.120748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerDied","Data":"6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123"} Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.143146 5094 generic.go:334] "Generic (PLEG): container finished" podID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" 
containerID="559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2" exitCode=0 Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.143229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerDied","Data":"559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2"} Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.495260 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.576787 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.608827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609370 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609373 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609416 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.610046 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.625393 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.625425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv" (OuterVolumeSpecName: "kube-api-access-k6zwv") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "kube-api-access-k6zwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.636920 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts" (OuterVolumeSpecName: "scripts") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.666688 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712263 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712302 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712315 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712329 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.715450 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data" (OuterVolumeSpecName: "config-data") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.813641 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.156640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerDied","Data":"96e6715e90a0b7ca9a4f361deda20ad4a5f2e1b7bb2403e2af62f561294f2544"} Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.156696 5094 scope.go:117] "RemoveContainer" containerID="6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.156786 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.185790 5094 scope.go:117] "RemoveContainer" containerID="559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.204903 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.219132 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.235837 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: E0220 08:47:48.236308 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236328 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" Feb 20 
08:47:48 crc kubenswrapper[5094]: E0220 08:47:48.236351 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236358 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236523 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236543 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.237571 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.240041 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.253233 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.309307 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322419 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88808044-5011-40de-9088-154284495e1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322866 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzzm\" (UniqueName: \"kubernetes.io/projected/88808044-5011-40de-9088-154284495e1a-kube-api-access-mjzzm\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.323003 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-scripts\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.323310 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425621 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88808044-5011-40de-9088-154284495e1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425665 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzzm\" (UniqueName: \"kubernetes.io/projected/88808044-5011-40de-9088-154284495e1a-kube-api-access-mjzzm\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.426370 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88808044-5011-40de-9088-154284495e1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.430455 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.433758 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.436495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-scripts\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.436561 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.444101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-mjzzm\" (UniqueName: \"kubernetes.io/projected/88808044-5011-40de-9088-154284495e1a-kube-api-access-mjzzm\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.563293 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.944971 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:49 crc kubenswrapper[5094]: I0220 08:47:49.168215 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88808044-5011-40de-9088-154284495e1a","Type":"ContainerStarted","Data":"702907a7643fffb7875b919f007bb005ff748c49ecad63e3eb1db73d3953e8f0"} Feb 20 08:47:49 crc kubenswrapper[5094]: I0220 08:47:49.854891 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" path="/var/lib/kubelet/pods/7cb209d2-d0d5-41b3-a452-ffe3fd846798/volumes" Feb 20 08:47:50 crc kubenswrapper[5094]: I0220 08:47:50.178626 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88808044-5011-40de-9088-154284495e1a","Type":"ContainerStarted","Data":"331e7f6fd087287847d0160f47f187ac0aa2f7b50a0f8c6e674fc48ad5b6acf5"} Feb 20 08:47:50 crc kubenswrapper[5094]: I0220 08:47:50.179019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88808044-5011-40de-9088-154284495e1a","Type":"ContainerStarted","Data":"ebba4ba9a94e19f6e1606438b8296ad93fc0dbb17dc965adce0aa681a076f877"} Feb 20 08:47:50 crc kubenswrapper[5094]: I0220 08:47:50.199542 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.199526044 podStartE2EDuration="2.199526044s" 
podCreationTimestamp="2026-02-20 08:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:50.198032868 +0000 UTC m=+7285.070659579" watchObservedRunningTime="2026-02-20 08:47:50.199526044 +0000 UTC m=+7285.072152755" Feb 20 08:47:52 crc kubenswrapper[5094]: I0220 08:47:52.496257 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 08:47:53 crc kubenswrapper[5094]: I0220 08:47:53.564243 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 08:47:53 crc kubenswrapper[5094]: I0220 08:47:53.840926 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:53 crc kubenswrapper[5094]: E0220 08:47:53.841475 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:58 crc kubenswrapper[5094]: I0220 08:47:58.853943 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 08:48:04 crc kubenswrapper[5094]: I0220 08:48:04.840029 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:04 crc kubenswrapper[5094]: E0220 08:48:04.840817 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:48:19 crc kubenswrapper[5094]: I0220 08:48:19.841430 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:19 crc kubenswrapper[5094]: E0220 08:48:19.842234 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.084049 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.086998 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.127913 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.190391 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.190679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.190809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.291878 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.291990 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.292048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.292562 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.292592 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.324645 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.409340 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.883608 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:28 crc kubenswrapper[5094]: W0220 08:48:28.885264 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc1476cd_037c_4974_ac5b_7b914e175b0c.slice/crio-634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6 WatchSource:0}: Error finding container 634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6: Status 404 returned error can't find the container with id 634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6 Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.577105 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" exitCode=0 Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.577201 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68"} Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.577523 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerStarted","Data":"634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6"} Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.579857 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:48:30 crc kubenswrapper[5094]: I0220 08:48:30.590127 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerStarted","Data":"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba"} Feb 20 08:48:31 crc kubenswrapper[5094]: I0220 08:48:31.599645 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" exitCode=0 Feb 20 08:48:31 crc kubenswrapper[5094]: I0220 08:48:31.599687 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba"} Feb 20 08:48:32 crc kubenswrapper[5094]: I0220 08:48:32.615307 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerStarted","Data":"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3"} Feb 20 08:48:32 crc kubenswrapper[5094]: I0220 08:48:32.641113 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bhj65" podStartSLOduration=2.154948426 podStartE2EDuration="4.641072942s" podCreationTimestamp="2026-02-20 08:48:28 +0000 UTC" firstStartedPulling="2026-02-20 08:48:29.579389523 +0000 UTC m=+7324.452016264" lastFinishedPulling="2026-02-20 08:48:32.065514059 +0000 UTC m=+7326.938140780" observedRunningTime="2026-02-20 08:48:32.635718253 +0000 UTC m=+7327.508344964" watchObservedRunningTime="2026-02-20 08:48:32.641072942 +0000 UTC m=+7327.513699703" Feb 20 08:48:32 crc kubenswrapper[5094]: I0220 08:48:32.841181 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:32 crc kubenswrapper[5094]: E0220 08:48:32.842056 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.410546 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.411308 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.479380 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.719803 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.768138 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:40 crc kubenswrapper[5094]: I0220 08:48:40.691797 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bhj65" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" containerID="cri-o://4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" gracePeriod=2 Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.178675 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.247369 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"dc1476cd-037c-4974-ac5b-7b914e175b0c\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.247472 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"dc1476cd-037c-4974-ac5b-7b914e175b0c\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.247521 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"dc1476cd-037c-4974-ac5b-7b914e175b0c\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.249136 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities" (OuterVolumeSpecName: "utilities") pod "dc1476cd-037c-4974-ac5b-7b914e175b0c" (UID: "dc1476cd-037c-4974-ac5b-7b914e175b0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.263002 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql" (OuterVolumeSpecName: "kube-api-access-jz7ql") pod "dc1476cd-037c-4974-ac5b-7b914e175b0c" (UID: "dc1476cd-037c-4974-ac5b-7b914e175b0c"). InnerVolumeSpecName "kube-api-access-jz7ql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.274155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc1476cd-037c-4974-ac5b-7b914e175b0c" (UID: "dc1476cd-037c-4974-ac5b-7b914e175b0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.350235 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") on node \"crc\" DevicePath \"\"" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.350527 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.350541 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701780 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" exitCode=0 Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701824 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3"} Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701853 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6"} Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701848 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701867 5094 scope.go:117] "RemoveContainer" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.726048 5094 scope.go:117] "RemoveContainer" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.732889 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.759027 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.776476 5094 scope.go:117] "RemoveContainer" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.815164 5094 scope.go:117] "RemoveContainer" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" Feb 20 08:48:41 crc kubenswrapper[5094]: E0220 08:48:41.816148 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3\": container with ID starting with 4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3 not found: ID does not exist" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.816183 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3"} err="failed to get container status \"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3\": rpc error: code = NotFound desc = could not find container \"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3\": container with ID starting with 4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3 not found: ID does not exist" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.816205 5094 scope.go:117] "RemoveContainer" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" Feb 20 08:48:41 crc kubenswrapper[5094]: E0220 08:48:41.818660 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba\": container with ID starting with c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba not found: ID does not exist" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.818689 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba"} err="failed to get container status \"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba\": rpc error: code = NotFound desc = could not find container \"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba\": container with ID starting with c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba not found: ID does not exist" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.818718 5094 scope.go:117] "RemoveContainer" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" Feb 20 08:48:41 crc kubenswrapper[5094]: E0220 
08:48:41.823817 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68\": container with ID starting with d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68 not found: ID does not exist" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.823862 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68"} err="failed to get container status \"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68\": rpc error: code = NotFound desc = could not find container \"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68\": container with ID starting with d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68 not found: ID does not exist" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.867360 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" path="/var/lib/kubelet/pods/dc1476cd-037c-4974-ac5b-7b914e175b0c/volumes" Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.037924 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-l2kgb"] Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.047720 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"] Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.054805 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-l2kgb"] Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.062478 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"] Feb 20 08:48:43 crc kubenswrapper[5094]: I0220 08:48:43.854321 
5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" path="/var/lib/kubelet/pods/1d1678de-0344-47d5-98bb-d9ffd63912e7/volumes" Feb 20 08:48:43 crc kubenswrapper[5094]: I0220 08:48:43.855614 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" path="/var/lib/kubelet/pods/ec59a7fc-e360-4e39-8c57-cfaa43d23566/volumes" Feb 20 08:48:46 crc kubenswrapper[5094]: I0220 08:48:46.840392 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:47 crc kubenswrapper[5094]: I0220 08:48:47.765974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268"} Feb 20 08:48:53 crc kubenswrapper[5094]: I0220 08:48:53.041134 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bh8cv"] Feb 20 08:48:53 crc kubenswrapper[5094]: I0220 08:48:53.061403 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-bh8cv"] Feb 20 08:48:53 crc kubenswrapper[5094]: I0220 08:48:53.850628 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" path="/var/lib/kubelet/pods/81601ce5-f2ae-4f57-a829-6b235b7ae4df/volumes" Feb 20 08:49:06 crc kubenswrapper[5094]: I0220 08:49:06.057848 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:49:06 crc kubenswrapper[5094]: I0220 08:49:06.071299 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:49:07 crc kubenswrapper[5094]: I0220 08:49:07.851808 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" path="/var/lib/kubelet/pods/5ab956a6-a68a-4da9-9065-6f09fb2a8f28/volumes" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.088639 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.089771 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-content" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089784 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-content" Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.089801 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-utilities" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089808 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-utilities" Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.089824 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089987 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.090871 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093043 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093208 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093417 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mpk6z" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093582 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.109015 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.142401 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.142631 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" containerID="cri-o://280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.142748 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" containerID="cri-o://7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.207780 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.208064 
5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" containerID="cri-o://ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.208238 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" containerID="cri-o://d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.222778 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.224765 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237731 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237760 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237824 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.265331 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.339940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.340618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 
08:49:31.340789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.340897 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341481 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341943 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342157 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342328 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342986 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.343267 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod 
\"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.354231 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.362622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.420035 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443582 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.444115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.445591 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0babde66_7106_44f9_8108_dc7123e64645.slice/crio-ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673.scope\": RecentStats: unable to find data in memory cache]" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.448496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod 
\"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.449899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.459120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.462899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.611432 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.818498 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.869945 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.872028 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.889689 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.923411 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: W0220 08:49:31.971367 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa4baf13_0870_4bf6_9a0b_d4fd1fb598ce.slice/crio-cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba WatchSource:0}: Error finding container cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba: Status 404 returned error can't find the container with id cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.976764 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055250 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055279 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055333 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157509 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.158167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.158418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.159195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " 
pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.163415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.173021 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.190078 5094 generic.go:334] "Generic (PLEG): container finished" podID="0babde66-7106-44f9-8108-dc7123e64645" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" exitCode=143 Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.190155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerDied","Data":"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.192542 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" exitCode=143 Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.192608 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerDied","Data":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 
08:49:32.194158 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerStarted","Data":"eb0248c1e16d27da1be9e878bc7100201452eaa9c36214b5074f678269d497a4"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.195873 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerStarted","Data":"cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.205759 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.761664 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:49:32 crc kubenswrapper[5094]: W0220 08:49:32.800001 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eb2c0e1_59eb_4f7a_aeea_8965a35d861c.slice/crio-8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35 WatchSource:0}: Error finding container 8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35: Status 404 returned error can't find the container with id 8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35 Feb 20 08:49:33 crc kubenswrapper[5094]: I0220 08:49:33.205957 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerStarted","Data":"8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35"} Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.816675 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912552 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912743 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912799 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912824 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912862 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.913381 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.914229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs" (OuterVolumeSpecName: "logs") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.919778 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts" (OuterVolumeSpecName: "scripts") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.920314 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9" (OuterVolumeSpecName: "kube-api-access-shfm9") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "kube-api-access-shfm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.920550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph" (OuterVolumeSpecName: "ceph") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.947689 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.950469 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019489 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019525 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019535 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019546 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019559 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019568 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.034968 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data" (OuterVolumeSpecName: "config-data") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121296 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121688 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121809 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121841 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122059 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122129 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122213 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122840 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.127084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn" (OuterVolumeSpecName: "kube-api-access-pcxvn") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "kube-api-access-pcxvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.127828 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.127938 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs" (OuterVolumeSpecName: "logs") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.129988 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph" (OuterVolumeSpecName: "ceph") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.130288 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts" (OuterVolumeSpecName: "scripts") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.153581 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.174724 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data" (OuterVolumeSpecName: "config-data") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224921 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224966 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224981 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224993 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.225001 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.225009 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.225018 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240446 5094 generic.go:334] "Generic (PLEG): container finished" podID="0babde66-7106-44f9-8108-dc7123e64645" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" exitCode=0 Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240533 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240534 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerDied","Data":"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerDied","Data":"8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240621 5094 scope.go:117] "RemoveContainer" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244579 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" exitCode=0 Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerDied","Data":"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerDied","Data":"b0e75d749acef441fc02419393d76ceab32d56e244c55548218d0246a2690c4a"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244732 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.291051 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.304632 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.323345 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.343959 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344412 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344427 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344436 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 
08:49:35.344443 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344463 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344471 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344501 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344509 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344729 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344752 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344769 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.345869 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.347679 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.347861 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.347986 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7xqwp" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.381279 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.390613 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.407813 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.410391 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.412527 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.418474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532437 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532488 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532546 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-config-data\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532625 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" 
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-scripts\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532846 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-ceph\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532915 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gsvl\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-kube-api-access-4gsvl\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-logs\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" 
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533174 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n645w\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-kube-api-access-n645w\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533206 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533253 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") 
" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634512 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-scripts\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-ceph\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gsvl\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-kube-api-access-4gsvl\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-logs\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 
08:49:35.634683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634725 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n645w\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-kube-api-access-n645w\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634742 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634778 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634801 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634817 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-config-data\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634870 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.635278 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.636238 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.636252 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.636502 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-logs\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.639219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.645889 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646078 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-config-data\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.648683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-ceph\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.650932 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: 
I0220 08:49:35.651213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gsvl\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-kube-api-access-4gsvl\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.654360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n645w\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-kube-api-access-n645w\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.713207 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.728488 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.854432 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0babde66-7106-44f9-8108-dc7123e64645" path="/var/lib/kubelet/pods/0babde66-7106-44f9-8108-dc7123e64645/volumes" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.855563 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" path="/var/lib/kubelet/pods/9ba9e313-83db-4e08-a308-376d5fdf5820/volumes" Feb 20 08:49:37 crc kubenswrapper[5094]: I0220 08:49:37.786091 5094 scope.go:117] "RemoveContainer" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.052765 5094 scope.go:117] "RemoveContainer" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.067371 5094 scope.go:117] "RemoveContainer" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.138277 5094 scope.go:117] "RemoveContainer" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.138644 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": container with ID starting with d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54 not found: ID does not exist" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.138729 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"} err="failed to 
get container status \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": rpc error: code = NotFound desc = could not find container \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": container with ID starting with d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54 not found: ID does not exist" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.138768 5094 scope.go:117] "RemoveContainer" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.139105 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673\": container with ID starting with ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673 not found: ID does not exist" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.139133 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"} err="failed to get container status \"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673\": rpc error: code = NotFound desc = could not find container \"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673\": container with ID starting with ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673 not found: ID does not exist" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.139151 5094 scope.go:117] "RemoveContainer" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.149613 5094 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_glance-log_glance-default-external-api-0_openstack_0babde66-7106-44f9-8108-dc7123e64645_0 in pod sandbox 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5: identifier is not a container" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.149669 5094 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_glance-log_glance-default-external-api-0_openstack_0babde66-7106-44f9-8108-dc7123e64645_0 in pod sandbox 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5: identifier is not a container" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.149693 5094 scope.go:117] "RemoveContainer" containerID="130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.374233 5094 scope.go:117] "RemoveContainer" containerID="8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.395946 5094 scope.go:117] "RemoveContainer" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.396466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": container with ID starting with 280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1 not found: ID does not exist" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.396510 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} err="failed 
to get container status \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": rpc error: code = NotFound desc = could not find container \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": container with ID starting with 280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1 not found: ID does not exist" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.396544 5094 scope.go:117] "RemoveContainer" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.397314 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": container with ID starting with 7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb not found: ID does not exist" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.397352 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"} err="failed to get container status \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": rpc error: code = NotFound desc = could not find container \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": container with ID starting with 7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb not found: ID does not exist" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.397378 5094 scope.go:117] "RemoveContainer" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.397898 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} 
err="failed to get container status \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": rpc error: code = NotFound desc = could not find container \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": container with ID starting with 280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1 not found: ID does not exist" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.420171 5094 scope.go:117] "RemoveContainer" containerID="cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.442059 5094 scope.go:117] "RemoveContainer" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.442547 5094 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": rpc error: code = NotFound desc = could not find container \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": container with ID starting with 7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb not found: ID does not exist" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.442601 5094 scope.go:117] "RemoveContainer" containerID="52db5b53565602a22b540482712ac73023427fa1b0c5c5dd0a43d58c9fbc73b5" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.475859 5094 scope.go:117] "RemoveContainer" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.477046 5094 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": rpc error: code = NotFound desc = could not find container \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": 
container with ID starting with d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54 not found: ID does not exist" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.691363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:40 crc kubenswrapper[5094]: W0220 08:49:40.706532 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67b4c32_25f3_4bc0_af69_ff9a9aa04404.slice/crio-a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161 WatchSource:0}: Error finding container a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161: Status 404 returned error can't find the container with id a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161 Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.793508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:40 crc kubenswrapper[5094]: W0220 08:49:40.797771 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9c121ca_4074_4775_a8e5_0c7f8a00ce22.slice/crio-10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae WatchSource:0}: Error finding container 10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae: Status 404 returned error can't find the container with id 10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.347733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9c121ca-4074-4775-a8e5-0c7f8a00ce22","Type":"ContainerStarted","Data":"10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.349430 
5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f67b4c32-25f3-4bc0-af69-ff9a9aa04404","Type":"ContainerStarted","Data":"a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.351466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerStarted","Data":"383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.351533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerStarted","Data":"1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerStarted","Data":"a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerStarted","Data":"4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359622 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54879cd99c-v7mr5" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log" containerID="cri-o://4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6" gracePeriod=30 Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359906 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-54879cd99c-v7mr5" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon" containerID="cri-o://a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46" gracePeriod=30 Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.373932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerStarted","Data":"f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.373989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerStarted","Data":"b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c"} Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.387009 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d54b4d569-kqd4s" podStartSLOduration=3.006720379 podStartE2EDuration="10.386974937s" podCreationTimestamp="2026-02-20 08:49:31 +0000 UTC" firstStartedPulling="2026-02-20 08:49:32.802230749 +0000 UTC m=+7387.674857460" lastFinishedPulling="2026-02-20 08:49:40.182485307 +0000 UTC m=+7395.055112018" observedRunningTime="2026-02-20 08:49:41.378568844 +0000 UTC m=+7396.251195555" watchObservedRunningTime="2026-02-20 08:49:41.386974937 +0000 UTC m=+7396.259601648" Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.415777 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54879cd99c-v7mr5" podStartSLOduration=2.12496127 podStartE2EDuration="10.415755009s" podCreationTimestamp="2026-02-20 08:49:31 +0000 UTC" firstStartedPulling="2026-02-20 08:49:31.932887869 +0000 UTC m=+7386.805514580" lastFinishedPulling="2026-02-20 08:49:40.223681608 +0000 UTC m=+7395.096308319" observedRunningTime="2026-02-20 08:49:41.395633786 +0000 UTC m=+7396.268260497" 
watchObservedRunningTime="2026-02-20 08:49:41.415755009 +0000 UTC m=+7396.288381720" Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.420557 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.426108 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cd88dbb9c-w5xqj" podStartSLOduration=2.221772728 podStartE2EDuration="10.426086177s" podCreationTimestamp="2026-02-20 08:49:31 +0000 UTC" firstStartedPulling="2026-02-20 08:49:31.974752856 +0000 UTC m=+7386.847379567" lastFinishedPulling="2026-02-20 08:49:40.179066305 +0000 UTC m=+7395.051693016" observedRunningTime="2026-02-20 08:49:41.418657139 +0000 UTC m=+7396.291283850" watchObservedRunningTime="2026-02-20 08:49:41.426086177 +0000 UTC m=+7396.298712888" Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.611515 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.611606 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.206795 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.208098 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.404018 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9c121ca-4074-4775-a8e5-0c7f8a00ce22","Type":"ContainerStarted","Data":"ebfce51c6e65003d83ecac4575539cbe89ef33ed9ef179649160e40ffb0d9424"} Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.404075 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9c121ca-4074-4775-a8e5-0c7f8a00ce22","Type":"ContainerStarted","Data":"851ae34f377a0b4f477b2e669be692d9a489046997a716b0cd48cdf5ebcf1c96"} Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.413393 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f67b4c32-25f3-4bc0-af69-ff9a9aa04404","Type":"ContainerStarted","Data":"5e925d2c3ff306cc61595db457c21ce5ed0a6c83a5364ed398652c314c9004df"} Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.413438 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f67b4c32-25f3-4bc0-af69-ff9a9aa04404","Type":"ContainerStarted","Data":"b7058d3a04484a92e5290bb3b23f6ae51eb873831f097864bdd8b116c4fd9b7a"} Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.436857 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.436834608 podStartE2EDuration="7.436834608s" podCreationTimestamp="2026-02-20 08:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:49:42.428384945 +0000 UTC m=+7397.301011656" watchObservedRunningTime="2026-02-20 08:49:42.436834608 +0000 UTC m=+7397.309461329" Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.455986 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.455969889 podStartE2EDuration="7.455969889s" podCreationTimestamp="2026-02-20 08:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:49:42.45310212 +0000 UTC m=+7397.325728831" watchObservedRunningTime="2026-02-20 08:49:42.455969889 +0000 UTC m=+7397.328596600" Feb 
20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.715024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.717195 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.729336 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.729414 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.757315 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.771830 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.779514 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.789246 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456134 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456212 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456252 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456263 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:48 crc kubenswrapper[5094]: I0220 08:49:48.489631 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:49:48 crc kubenswrapper[5094]: I0220 08:49:48.522307 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:48 crc kubenswrapper[5094]: I0220 08:49:48.528204 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:49 crc kubenswrapper[5094]: I0220 08:49:49.603630 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:49:51 crc kubenswrapper[5094]: I0220 08:49:51.613177 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Feb 20 08:49:52 crc kubenswrapper[5094]: I0220 08:49:52.208975 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:50:03 crc kubenswrapper[5094]: I0220 08:50:03.507020 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:50:03 crc kubenswrapper[5094]: I0220 08:50:03.931674 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.264881 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.633158 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.694739 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.695297 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log" containerID="cri-o://b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c" gracePeriod=30 Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.695424 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" containerID="cri-o://f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e" gracePeriod=30 Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.451261 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.453164 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.471225 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.576573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.576690 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.576786 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.678452 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.678525 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.678574 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.679109 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.679221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.698649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.788478 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.269649 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.672484 5094 generic.go:334] "Generic (PLEG): container finished" podID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerID="543cf7ed92ceb54ac0ce3676ff68db26877706c2eb49f4a878eeda170e2830ad" exitCode=0 Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.672533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"543cf7ed92ceb54ac0ce3676ff68db26877706c2eb49f4a878eeda170e2830ad"} Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.672558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerStarted","Data":"71cc0a54d9dc6f0261bfd2197724a6801fb7455d85a516d26f0e7df693b59ef1"} Feb 20 08:50:09 crc kubenswrapper[5094]: I0220 08:50:09.697881 5094 generic.go:334] "Generic (PLEG): container finished" podID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerID="503bbbb1d2206ab997b04b7307eae096f0b1864d14ed076cb276c3526d85055c" exitCode=0 Feb 20 08:50:09 crc kubenswrapper[5094]: I0220 08:50:09.697974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"503bbbb1d2206ab997b04b7307eae096f0b1864d14ed076cb276c3526d85055c"} Feb 20 08:50:09 crc kubenswrapper[5094]: I0220 08:50:09.702783 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerID="f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e" exitCode=0 Feb 20 08:50:09 crc 
kubenswrapper[5094]: I0220 08:50:09.702826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerDied","Data":"f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e"} Feb 20 08:50:10 crc kubenswrapper[5094]: I0220 08:50:10.715287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerStarted","Data":"00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23"} Feb 20 08:50:10 crc kubenswrapper[5094]: I0220 08:50:10.738427 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csb4t" podStartSLOduration=2.198908806 podStartE2EDuration="4.738408066s" podCreationTimestamp="2026-02-20 08:50:06 +0000 UTC" firstStartedPulling="2026-02-20 08:50:07.675553208 +0000 UTC m=+7422.548179919" lastFinishedPulling="2026-02-20 08:50:10.215052458 +0000 UTC m=+7425.087679179" observedRunningTime="2026-02-20 08:50:10.733824405 +0000 UTC m=+7425.606451126" watchObservedRunningTime="2026-02-20 08:50:10.738408066 +0000 UTC m=+7425.611034777" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.612463 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725489 5094 generic.go:334] "Generic (PLEG): container finished" podID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerID="a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46" exitCode=137 Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725526 5094 generic.go:334] "Generic (PLEG): container 
finished" podID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerID="4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6" exitCode=137 Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725522 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerDied","Data":"a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46"} Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725566 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerDied","Data":"4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6"} Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725594 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerDied","Data":"eb0248c1e16d27da1be9e878bc7100201452eaa9c36214b5074f678269d497a4"} Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725606 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb0248c1e16d27da1be9e878bc7100201452eaa9c36214b5074f678269d497a4" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.814403 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984457 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.985164 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs" (OuterVolumeSpecName: "logs") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.992176 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x" (OuterVolumeSpecName: "kube-api-access-kkx4x") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "kube-api-access-kkx4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.993937 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.007070 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data" (OuterVolumeSpecName: "config-data") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.022602 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts" (OuterVolumeSpecName: "scripts") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086401 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086444 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086459 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086470 5094 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086481 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.732315 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.769609 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.783810 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:50:13 crc kubenswrapper[5094]: I0220 08:50:13.849208 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" path="/var/lib/kubelet/pods/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad/volumes" Feb 20 08:50:16 crc kubenswrapper[5094]: I0220 08:50:16.790747 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:16 crc kubenswrapper[5094]: I0220 08:50:16.791091 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:16 crc kubenswrapper[5094]: I0220 08:50:16.860309 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:17 crc kubenswrapper[5094]: I0220 08:50:17.810199 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:17 crc kubenswrapper[5094]: I0220 08:50:17.863678 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:19 crc kubenswrapper[5094]: I0220 08:50:19.793461 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-csb4t" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server" containerID="cri-o://00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23" gracePeriod=2 Feb 20 08:50:20 crc kubenswrapper[5094]: I0220 08:50:20.807049 5094 
generic.go:334] "Generic (PLEG): container finished" podID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerID="00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23" exitCode=0 Feb 20 08:50:20 crc kubenswrapper[5094]: I0220 08:50:20.807105 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23"} Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.395071 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.471787 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.471957 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.472110 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.472749 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities" 
(OuterVolumeSpecName: "utilities") pod "488d08d3-57d4-47fe-a49a-65c71e0e0c6e" (UID: "488d08d3-57d4-47fe-a49a-65c71e0e0c6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.477653 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g" (OuterVolumeSpecName: "kube-api-access-zqz4g") pod "488d08d3-57d4-47fe-a49a-65c71e0e0c6e" (UID: "488d08d3-57d4-47fe-a49a-65c71e0e0c6e"). InnerVolumeSpecName "kube-api-access-zqz4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.574378 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.574408 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.612419 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.616493 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "488d08d3-57d4-47fe-a49a-65c71e0e0c6e" (UID: "488d08d3-57d4-47fe-a49a-65c71e0e0c6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.676744 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.821226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"71cc0a54d9dc6f0261bfd2197724a6801fb7455d85a516d26f0e7df693b59ef1"} Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.821291 5094 scope.go:117] "RemoveContainer" containerID="00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.821374 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.870516 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.873118 5094 scope.go:117] "RemoveContainer" containerID="503bbbb1d2206ab997b04b7307eae096f0b1864d14ed076cb276c3526d85055c" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.880806 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.897312 5094 scope.go:117] "RemoveContainer" containerID="543cf7ed92ceb54ac0ce3676ff68db26877706c2eb49f4a878eeda170e2830ad" Feb 20 08:50:23 crc kubenswrapper[5094]: I0220 08:50:23.850111 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" path="/var/lib/kubelet/pods/488d08d3-57d4-47fe-a49a-65c71e0e0c6e/volumes" Feb 20 08:50:31 crc 
kubenswrapper[5094]: I0220 08:50:31.612255 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused"
Feb 20 08:50:31 crc kubenswrapper[5094]: I0220 08:50:31.612936 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cd88dbb9c-w5xqj"
Feb 20 08:50:35 crc kubenswrapper[5094]: E0220 08:50:35.939750 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa4baf13_0870_4bf6_9a0b_d4fd1fb598ce.slice/crio-conmon-b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 08:50:35 crc kubenswrapper[5094]: I0220 08:50:35.979303 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerID="b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c" exitCode=137
Feb 20 08:50:35 crc kubenswrapper[5094]: I0220 08:50:35.979348 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerDied","Data":"b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c"}
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.083119 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj"
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.191280 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") "
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.192012 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") "
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.192475 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") "
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.192761 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") "
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.193170 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") "
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.193376 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs" (OuterVolumeSpecName: "logs") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.194134 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.197096 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw" (OuterVolumeSpecName: "kube-api-access-x9cjw") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "kube-api-access-x9cjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.197184 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.214978 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts" (OuterVolumeSpecName: "scripts") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.215404 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data" (OuterVolumeSpecName: "config-data") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.295787 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.296001 5094 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.296015 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.296027 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.995947 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerDied","Data":"cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba"}
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.996041 5094 scope.go:117] "RemoveContainer" containerID="f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e"
Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.996248 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj"
Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.062146 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"]
Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.070495 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"]
Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.218859 5094 scope.go:117] "RemoveContainer" containerID="b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c"
Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.856954 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" path="/var/lib/kubelet/pods/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce/volumes"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.856033 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85f686b8b5-kz5d4"]
Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857279 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857299 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server"
Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857321 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857329 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon"
Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857342 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857351 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log"
Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857367 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857375 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon"
Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857393 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857400 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log"
Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857429 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-content"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857436 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-content"
Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857459 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-utilities"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857467 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-utilities"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857694 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857732 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857747 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857771 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857785 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.858957 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.878627 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85f686b8b5-kz5d4"]
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd051d85-41b3-420b-9999-5c9dee9aafe3-logs\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd051d85-41b3-420b-9999-5c9dee9aafe3-horizon-secret-key\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-config-data\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967955 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-scripts\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxpfb\" (UniqueName: \"kubernetes.io/projected/dd051d85-41b3-420b-9999-5c9dee9aafe3-kube-api-access-hxpfb\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.070767 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd051d85-41b3-420b-9999-5c9dee9aafe3-logs\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.070855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd051d85-41b3-420b-9999-5c9dee9aafe3-horizon-secret-key\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.070885 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-config-data\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.071086 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-scripts\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.071747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-scripts\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.071922 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxpfb\" (UniqueName: \"kubernetes.io/projected/dd051d85-41b3-420b-9999-5c9dee9aafe3-kube-api-access-hxpfb\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.072431 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-config-data\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.072661 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd051d85-41b3-420b-9999-5c9dee9aafe3-logs\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.090320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd051d85-41b3-420b-9999-5c9dee9aafe3-horizon-secret-key\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.095395 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxpfb\" (UniqueName: \"kubernetes.io/projected/dd051d85-41b3-420b-9999-5c9dee9aafe3-kube-api-access-hxpfb\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.180401 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85f686b8b5-kz5d4"
Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.656833 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85f686b8b5-kz5d4"]
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.049377 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-zdzg9"]
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.051346 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.071160 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zdzg9"]
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.090844 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.091274 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.136039 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"]
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137729 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85f686b8b5-kz5d4" event={"ID":"dd051d85-41b3-420b-9999-5c9dee9aafe3","Type":"ContainerStarted","Data":"aea2110b1c796c7236f073b2306c6f3a761f8d1b2287bc63c9f068d41c82bb34"}
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137769 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85f686b8b5-kz5d4" event={"ID":"dd051d85-41b3-420b-9999-5c9dee9aafe3","Type":"ContainerStarted","Data":"3c77126b664be918f03fe1aeab230480b2cfbabc1695d0b6dd41fb92b1c61748"}
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137784 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85f686b8b5-kz5d4" event={"ID":"dd051d85-41b3-420b-9999-5c9dee9aafe3","Type":"ContainerStarted","Data":"8a41fe68af7c453825038647e557a4d036841eb76b42e3bc4f40c7cf5415ecbf"}
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.139734 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.143556 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"]
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.162151 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85f686b8b5-kz5d4" podStartSLOduration=2.162114098 podStartE2EDuration="2.162114098s" podCreationTimestamp="2026-02-20 08:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:50:50.154886973 +0000 UTC m=+7465.027513684" watchObservedRunningTime="2026-02-20 08:50:50.162114098 +0000 UTC m=+7465.034740799"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199016 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199427 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199667 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.200392 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.223816 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.301739 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.301808 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.302424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.318122 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.368741 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.455510 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.811990 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zdzg9"]
Feb 20 08:50:50 crc kubenswrapper[5094]: W0220 08:50:50.813138 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1068d86d_d730_4dab_8aaf_12c5a5c62a70.slice/crio-11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4 WatchSource:0}: Error finding container 11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4: Status 404 returned error can't find the container with id 11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4
Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.935301 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"]
Feb 20 08:50:50 crc kubenswrapper[5094]: W0220 08:50:50.940509 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674e60ac_3253_4c4c_8e5b_7a59ed2e8989.slice/crio-677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56 WatchSource:0}: Error finding container 677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56: Status 404 returned error can't find the container with id 677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56
Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.146279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerStarted","Data":"54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349"}
Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.146587 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerStarted","Data":"11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4"}
Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.148853 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerStarted","Data":"f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd"}
Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.148906 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerStarted","Data":"677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56"}
Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.172340 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-zdzg9" podStartSLOduration=1.172319784 podStartE2EDuration="1.172319784s" podCreationTimestamp="2026-02-20 08:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:50:51.160416169 +0000 UTC m=+7466.033042880" watchObservedRunningTime="2026-02-20 08:50:51.172319784 +0000 UTC m=+7466.044946495"
Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.176650 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-b8a1-account-create-update-27rkz" podStartSLOduration=1.176627308 podStartE2EDuration="1.176627308s" podCreationTimestamp="2026-02-20 08:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:50:51.172031328 +0000 UTC m=+7466.044658039" watchObservedRunningTime="2026-02-20 08:50:51.176627308 +0000 UTC m=+7466.049254019"
Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.163230 5094 generic.go:334] "Generic (PLEG): container finished" podID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerID="f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd" exitCode=0
Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.163320 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerDied","Data":"f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd"}
Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.166696 5094 generic.go:334] "Generic (PLEG): container finished" podID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerID="54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349" exitCode=0
Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.166801 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerDied","Data":"54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349"}
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.628908 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.634972 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.663772 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") "
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.663952 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") "
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.664114 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") "
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.664207 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") "
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.667084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "674e60ac-3253-4c4c-8e5b-7a59ed2e8989" (UID: "674e60ac-3253-4c4c-8e5b-7a59ed2e8989"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.667479 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1068d86d-d730-4dab-8aaf-12c5a5c62a70" (UID: "1068d86d-d730-4dab-8aaf-12c5a5c62a70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.673233 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv" (OuterVolumeSpecName: "kube-api-access-5tbqv") pod "1068d86d-d730-4dab-8aaf-12c5a5c62a70" (UID: "1068d86d-d730-4dab-8aaf-12c5a5c62a70"). InnerVolumeSpecName "kube-api-access-5tbqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.674339 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz" (OuterVolumeSpecName: "kube-api-access-dwnfz") pod "674e60ac-3253-4c4c-8e5b-7a59ed2e8989" (UID: "674e60ac-3253-4c4c-8e5b-7a59ed2e8989"). InnerVolumeSpecName "kube-api-access-dwnfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766542 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766579 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766590 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766599 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.200303 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz"
Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.200295 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerDied","Data":"677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56"}
Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.200655 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56"
Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.202931 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerDied","Data":"11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4"}
Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.202969 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4"
Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.203025 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdzg9"
Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.327592 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-fmrw9"]
Feb 20 08:50:55 crc kubenswrapper[5094]: E0220 08:50:55.328402 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerName="mariadb-account-create-update"
Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328417 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerName="mariadb-account-create-update"
Feb 20 08:50:55 crc kubenswrapper[5094]: E0220 08:50:55.328442 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerName="mariadb-database-create"
Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328449 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerName="mariadb-database-create"
Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328668 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerName="mariadb-database-create"
Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328694 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerName="mariadb-account-create-update"
Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.329557 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.336140 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.336224 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-v8n4z" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.348533 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fmrw9"] Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.416847 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.417224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.417569 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.518837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod 
\"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.519102 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.519244 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.524390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.524795 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.539462 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.654654 
5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:56 crc kubenswrapper[5094]: I0220 08:50:56.095241 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fmrw9"] Feb 20 08:50:56 crc kubenswrapper[5094]: I0220 08:50:56.219913 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerStarted","Data":"037c8ccbfa67d05f5ae9f5822422a7180a181ab854dbbf2a204acef3cafe0f42"} Feb 20 08:50:59 crc kubenswrapper[5094]: I0220 08:50:59.180889 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:59 crc kubenswrapper[5094]: I0220 08:50:59.181938 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:59 crc kubenswrapper[5094]: I0220 08:50:59.184796 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85f686b8b5-kz5d4" podUID="dd051d85-41b3-420b-9999-5c9dee9aafe3" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.101:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.101:8080: connect: connection refused" Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.106529 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.107324 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.304151 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerStarted","Data":"da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3"} Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.325742 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-fmrw9" podStartSLOduration=1.818657999 podStartE2EDuration="9.325697267s" podCreationTimestamp="2026-02-20 08:50:55 +0000 UTC" firstStartedPulling="2026-02-20 08:50:56.102551626 +0000 UTC m=+7470.975178337" lastFinishedPulling="2026-02-20 08:51:03.609590894 +0000 UTC m=+7478.482217605" observedRunningTime="2026-02-20 08:51:04.325046052 +0000 UTC m=+7479.197672763" watchObservedRunningTime="2026-02-20 08:51:04.325697267 +0000 UTC m=+7479.198323998" Feb 20 08:51:06 crc kubenswrapper[5094]: I0220 08:51:06.327448 5094 generic.go:334] "Generic (PLEG): container finished" podID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerID="da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3" exitCode=0 Feb 20 08:51:06 crc kubenswrapper[5094]: I0220 08:51:06.327562 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerDied","Data":"da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3"} Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.680062 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.753104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.753405 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.754177 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.758328 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj" (OuterVolumeSpecName: "kube-api-access-522wj") pod "791e2b3b-9d51-41fd-bf38-5b66849b5b77" (UID: "791e2b3b-9d51-41fd-bf38-5b66849b5b77"). InnerVolumeSpecName "kube-api-access-522wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.791272 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "791e2b3b-9d51-41fd-bf38-5b66849b5b77" (UID: "791e2b3b-9d51-41fd-bf38-5b66849b5b77"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.839078 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data" (OuterVolumeSpecName: "config-data") pod "791e2b3b-9d51-41fd-bf38-5b66849b5b77" (UID: "791e2b3b-9d51-41fd-bf38-5b66849b5b77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.861731 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.861811 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.861831 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:08 crc kubenswrapper[5094]: I0220 08:51:08.349200 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerDied","Data":"037c8ccbfa67d05f5ae9f5822422a7180a181ab854dbbf2a204acef3cafe0f42"} Feb 20 08:51:08 crc kubenswrapper[5094]: I0220 08:51:08.349242 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="037c8ccbfa67d05f5ae9f5822422a7180a181ab854dbbf2a204acef3cafe0f42" Feb 20 08:51:08 crc kubenswrapper[5094]: I0220 08:51:08.349293 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.325539 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-68d6fbc7c5-czl7r"] Feb 20 08:51:09 crc kubenswrapper[5094]: E0220 08:51:09.326216 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerName="heat-db-sync" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.326228 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerName="heat-db-sync" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.326486 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerName="heat-db-sync" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.327243 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.332036 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.332051 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.332288 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-v8n4z" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.359201 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-68d6fbc7c5-czl7r"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390091 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-combined-ca-bundle\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: 
\"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390354 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data-custom\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390498 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xxl6\" (UniqueName: \"kubernetes.io/projected/f4697fe9-ee95-4003-81d9-c6d7935b46cd-kube-api-access-9xxl6\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492239 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492326 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-combined-ca-bundle\") pod 
\"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492375 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data-custom\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492410 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xxl6\" (UniqueName: \"kubernetes.io/projected/f4697fe9-ee95-4003-81d9-c6d7935b46cd-kube-api-access-9xxl6\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.501926 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-combined-ca-bundle\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.504001 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.506138 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data-custom\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: 
\"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.511770 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76899f657-g7f8m"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.513235 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.519064 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xxl6\" (UniqueName: \"kubernetes.io/projected/f4697fe9-ee95-4003-81d9-c6d7935b46cd-kube-api-access-9xxl6\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.519468 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.525491 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76899f657-g7f8m"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.553837 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6cc7f55d5c-lvdts"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.555141 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.560945 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595122 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rpqn\" (UniqueName: \"kubernetes.io/projected/891348e7-69c8-46e3-a5c2-86c001574a89-kube-api-access-2rpqn\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595165 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-combined-ca-bundle\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595258 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data-custom\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595289 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.606394 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-api-6cc7f55d5c-lvdts"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.650139 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697048 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jlhw\" (UniqueName: \"kubernetes.io/projected/128b27b4-464a-4392-af17-51d79bdd1e1e-kube-api-access-8jlhw\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697221 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data-custom\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697345 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697373 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-combined-ca-bundle\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697406 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data-custom\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697481 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rpqn\" (UniqueName: \"kubernetes.io/projected/891348e7-69c8-46e3-a5c2-86c001574a89-kube-api-access-2rpqn\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697508 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-combined-ca-bundle\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.702535 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-combined-ca-bundle\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.702671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.704961 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data-custom\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.720456 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rpqn\" (UniqueName: \"kubernetes.io/projected/891348e7-69c8-46e3-a5c2-86c001574a89-kube-api-access-2rpqn\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.799570 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jlhw\" (UniqueName: \"kubernetes.io/projected/128b27b4-464a-4392-af17-51d79bdd1e1e-kube-api-access-8jlhw\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.799950 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.799996 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-combined-ca-bundle\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.800022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data-custom\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.806251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.806947 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data-custom\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.814297 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-combined-ca-bundle\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.815562 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jlhw\" (UniqueName: \"kubernetes.io/projected/128b27b4-464a-4392-af17-51d79bdd1e1e-kube-api-access-8jlhw\") pod 
\"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.904619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.913934 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.153594 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-68d6fbc7c5-czl7r"] Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.382206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-68d6fbc7c5-czl7r" event={"ID":"f4697fe9-ee95-4003-81d9-c6d7935b46cd","Type":"ContainerStarted","Data":"794f615ce1880f7ab8bbfd5bb19e5fa1f93a61809b6001dcd8fad2d8bdeb5a3b"} Feb 20 08:51:10 crc kubenswrapper[5094]: W0220 08:51:10.576090 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod891348e7_69c8_46e3_a5c2_86c001574a89.slice/crio-e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc WatchSource:0}: Error finding container e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc: Status 404 returned error can't find the container with id e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.585355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76899f657-g7f8m"] Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.642647 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6cc7f55d5c-lvdts"] Feb 20 08:51:10 crc kubenswrapper[5094]: W0220 08:51:10.644882 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod128b27b4_464a_4392_af17_51d79bdd1e1e.slice/crio-b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8 WatchSource:0}: Error finding container b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8: Status 404 returned error can't find the container with id b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8 Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.417638 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cc7f55d5c-lvdts" event={"ID":"128b27b4-464a-4392-af17-51d79bdd1e1e","Type":"ContainerStarted","Data":"b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8"} Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.421302 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-68d6fbc7c5-czl7r" event={"ID":"f4697fe9-ee95-4003-81d9-c6d7935b46cd","Type":"ContainerStarted","Data":"fc2d8d6d91d1e0d6b463340683d1d9746cd3dfead8e2fdf2bb05e8b2d2af0428"} Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.421505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.423005 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76899f657-g7f8m" event={"ID":"891348e7-69c8-46e3-a5c2-86c001574a89","Type":"ContainerStarted","Data":"e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc"} Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.444155 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-68d6fbc7c5-czl7r" podStartSLOduration=2.444136869 podStartE2EDuration="2.444136869s" podCreationTimestamp="2026-02-20 08:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:51:11.43545695 +0000 UTC 
m=+7486.308083661" watchObservedRunningTime="2026-02-20 08:51:11.444136869 +0000 UTC m=+7486.316763580" Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.477904 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.434133 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cc7f55d5c-lvdts" event={"ID":"128b27b4-464a-4392-af17-51d79bdd1e1e","Type":"ContainerStarted","Data":"215e02f645838bc960bfaa8b0aaa38d199c008cca27afa3cccc2a634176bed2e"} Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.434671 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.436620 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76899f657-g7f8m" event={"ID":"891348e7-69c8-46e3-a5c2-86c001574a89","Type":"ContainerStarted","Data":"47b3d2aa2aa8743d5d34690f2e29891b2ab75a019e1ab5624c6628a1fb916c7a"} Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.453810 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6cc7f55d5c-lvdts" podStartSLOduration=1.961544452 podStartE2EDuration="3.453788234s" podCreationTimestamp="2026-02-20 08:51:09 +0000 UTC" firstStartedPulling="2026-02-20 08:51:10.646862403 +0000 UTC m=+7485.519489114" lastFinishedPulling="2026-02-20 08:51:12.139106185 +0000 UTC m=+7487.011732896" observedRunningTime="2026-02-20 08:51:12.451859937 +0000 UTC m=+7487.324486648" watchObservedRunningTime="2026-02-20 08:51:12.453788234 +0000 UTC m=+7487.326414965" Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.473440 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-76899f657-g7f8m" podStartSLOduration=1.915759051 podStartE2EDuration="3.473415186s" podCreationTimestamp="2026-02-20 08:51:09 +0000 
UTC" firstStartedPulling="2026-02-20 08:51:10.578372046 +0000 UTC m=+7485.450998757" lastFinishedPulling="2026-02-20 08:51:12.136028181 +0000 UTC m=+7487.008654892" observedRunningTime="2026-02-20 08:51:12.467033752 +0000 UTC m=+7487.339660463" watchObservedRunningTime="2026-02-20 08:51:12.473415186 +0000 UTC m=+7487.346041897" Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.348310 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.414569 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.414936 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" containerID="cri-o://383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d" gracePeriod=30 Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.415177 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" containerID="cri-o://1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6" gracePeriod=30 Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.452644 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:17 crc kubenswrapper[5094]: I0220 08:51:17.497171 5094 generic.go:334] "Generic (PLEG): container finished" podID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerID="383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d" exitCode=0 Feb 20 08:51:17 crc kubenswrapper[5094]: I0220 08:51:17.497292 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" 
event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerDied","Data":"383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d"} Feb 20 08:51:21 crc kubenswrapper[5094]: I0220 08:51:21.285000 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:21 crc kubenswrapper[5094]: I0220 08:51:21.346585 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:22 crc kubenswrapper[5094]: I0220 08:51:22.207081 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.049280 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.119590 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.141832 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.163349 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.854029 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" path="/var/lib/kubelet/pods/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2/volumes" Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.854991 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" 
path="/var/lib/kubelet/pods/eb71d5b0-a19d-4900-be92-77b1abeaf856/volumes" Feb 20 08:51:29 crc kubenswrapper[5094]: I0220 08:51:29.678141 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:32 crc kubenswrapper[5094]: I0220 08:51:32.206419 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:51:34 crc kubenswrapper[5094]: I0220 08:51:34.107190 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:51:34 crc kubenswrapper[5094]: I0220 08:51:34.107660 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:51:36 crc kubenswrapper[5094]: I0220 08:51:36.038466 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:51:36 crc kubenswrapper[5094]: I0220 08:51:36.046691 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:51:37 crc kubenswrapper[5094]: I0220 08:51:37.850648 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" path="/var/lib/kubelet/pods/7b241ede-085a-44b3-857b-f64e36b7b14f/volumes" Feb 20 08:51:40 crc kubenswrapper[5094]: 
I0220 08:51:40.694954 5094 scope.go:117] "RemoveContainer" containerID="4231927e6f52319c4c7cbbaa5766e18430942afbbae151ea27a85c1b2eed2b12" Feb 20 08:51:40 crc kubenswrapper[5094]: I0220 08:51:40.725006 5094 scope.go:117] "RemoveContainer" containerID="ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4" Feb 20 08:51:40 crc kubenswrapper[5094]: I0220 08:51:40.795543 5094 scope.go:117] "RemoveContainer" containerID="4f87cc562d40739a0734989e8f19246c6cf1e1144b307f5249bd8e950afcfbb0" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.250771 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz"] Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.253850 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.268084 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz"] Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.302875 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.405801 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.405930 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.406126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508088 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508245 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: 
\"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.509132 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.528693 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.650950 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.170642 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz"] Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.207914 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.208065 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.740519 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerStarted","Data":"3cb356662f4541cb557f9f84de510f207980149256804e45c03e701292ad6e86"} Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.740868 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerStarted","Data":"2fda3078b24f95f337aee86c4263f656c6c005cd02148d7411e16695a2ec7a86"} Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.751097 5094 generic.go:334] "Generic (PLEG): container finished" podID="77007c08-6c58-4a19-9c49-09c1677b9070" containerID="3cb356662f4541cb557f9f84de510f207980149256804e45c03e701292ad6e86" exitCode=0 Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.751249 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"3cb356662f4541cb557f9f84de510f207980149256804e45c03e701292ad6e86"} Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.757279 5094 generic.go:334] "Generic (PLEG): container finished" podID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerID="1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6" exitCode=137 Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.757316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerDied","Data":"1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6"} Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.886205 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952107 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: 
\"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952315 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952381 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.953229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs" (OuterVolumeSpecName: "logs") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.958025 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.958128 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz" (OuterVolumeSpecName: "kube-api-access-2rkqz") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). 
InnerVolumeSpecName "kube-api-access-2rkqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.981333 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts" (OuterVolumeSpecName: "scripts") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.985807 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data" (OuterVolumeSpecName: "config-data") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054232 5094 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054487 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054495 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054504 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") on 
node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054514 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.773815 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerDied","Data":"8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35"} Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.773907 5094 scope.go:117] "RemoveContainer" containerID="383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.773946 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.834951 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.851543 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.021268 5094 scope.go:117] "RemoveContainer" containerID="1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6" Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.786608 5094 generic.go:334] "Generic (PLEG): container finished" podID="77007c08-6c58-4a19-9c49-09c1677b9070" containerID="64c504c7c1cea457bbff3398c963e65cd43b025824a9b551288752d71fb56546" exitCode=0 Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.786646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" 
event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"64c504c7c1cea457bbff3398c963e65cd43b025824a9b551288752d71fb56546"} Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.861354 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" path="/var/lib/kubelet/pods/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c/volumes" Feb 20 08:51:46 crc kubenswrapper[5094]: I0220 08:51:46.803176 5094 generic.go:334] "Generic (PLEG): container finished" podID="77007c08-6c58-4a19-9c49-09c1677b9070" containerID="844a0746cb4596152109695e95d4e6d419894f770e1fe1f8e26fe5fe78f38751" exitCode=0 Feb 20 08:51:46 crc kubenswrapper[5094]: I0220 08:51:46.803303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"844a0746cb4596152109695e95d4e6d419894f770e1fe1f8e26fe5fe78f38751"} Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.232032 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.346345 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"77007c08-6c58-4a19-9c49-09c1677b9070\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.346802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"77007c08-6c58-4a19-9c49-09c1677b9070\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.346845 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"77007c08-6c58-4a19-9c49-09c1677b9070\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.348790 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle" (OuterVolumeSpecName: "bundle") pod "77007c08-6c58-4a19-9c49-09c1677b9070" (UID: "77007c08-6c58-4a19-9c49-09c1677b9070"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.356076 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5" (OuterVolumeSpecName: "kube-api-access-7n7j5") pod "77007c08-6c58-4a19-9c49-09c1677b9070" (UID: "77007c08-6c58-4a19-9c49-09c1677b9070"). InnerVolumeSpecName "kube-api-access-7n7j5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.359891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util" (OuterVolumeSpecName: "util") pod "77007c08-6c58-4a19-9c49-09c1677b9070" (UID: "77007c08-6c58-4a19-9c49-09c1677b9070"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.449181 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.449220 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.449233 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.824977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"2fda3078b24f95f337aee86c4263f656c6c005cd02148d7411e16695a2ec7a86"} Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.825020 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fda3078b24f95f337aee86c4263f656c6c005cd02148d7411e16695a2ec7a86" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.825052 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.047893 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rbwv2"] Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.059936 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"] Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.076180 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rbwv2"] Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.090132 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"] Feb 20 08:51:57 crc kubenswrapper[5094]: I0220 08:51:57.851999 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" path="/var/lib/kubelet/pods/0f3a4acd-5b68-467c-b024-b518d0f4d27e/volumes" Feb 20 08:51:57 crc kubenswrapper[5094]: I0220 08:51:57.853184 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" path="/var/lib/kubelet/pods/d8c2373d-6a69-460a-8622-d001dc53efc0/volumes" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829224 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="extract" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829670 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="extract" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829684 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="util" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829691 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="util" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829711 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="pull" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829717 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="pull" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829738 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829744 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829753 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829759 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829950 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="extract" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829965 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829977 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" Feb 20 08:51:58 
crc kubenswrapper[5094]: I0220 08:51:58.831224 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.844899 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.980071 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.980502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.980549 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.082946 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 
20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.083065 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.083119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.084239 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.084525 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.118031 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: 
I0220 08:51:59.155697 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.348815 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.350759 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.355612 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-6fs65" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.355888 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.356001 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.358698 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.403770 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mzr\" (UniqueName: \"kubernetes.io/projected/5a9736b1-aca8-4880-9d94-2d7c37efce50-kube-api-access-g9mzr\") pod \"obo-prometheus-operator-68bc856cb9-kxcc8\" (UID: \"5a9736b1-aca8-4880-9d94-2d7c37efce50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.505402 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mzr\" (UniqueName: 
\"kubernetes.io/projected/5a9736b1-aca8-4880-9d94-2d7c37efce50-kube-api-access-g9mzr\") pod \"obo-prometheus-operator-68bc856cb9-kxcc8\" (UID: \"5a9736b1-aca8-4880-9d94-2d7c37efce50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.559094 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.560749 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.574877 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.576076 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.577110 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mzr\" (UniqueName: \"kubernetes.io/projected/5a9736b1-aca8-4880-9d94-2d7c37efce50-kube-api-access-g9mzr\") pod \"obo-prometheus-operator-68bc856cb9-kxcc8\" (UID: \"5a9736b1-aca8-4880-9d94-2d7c37efce50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.587085 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qvgrr" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.587399 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 20 08:51:59 crc kubenswrapper[5094]: 
I0220 08:51:59.606657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.606789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.606834 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.606863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.629155 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.683264 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707554 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707661 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707730 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707781 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: 
\"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.717536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.728243 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.729816 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.733221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.733597 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: 
\"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.792451 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dcm8l"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.794113 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.797976 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-jnhxq" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.798151 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.813584 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/724c1050-e6d7-49c3-8b63-a89a3de26894-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.813661 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7qzw\" (UniqueName: \"kubernetes.io/projected/724c1050-e6d7-49c3-8b63-a89a3de26894-kube-api-access-p7qzw\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.829846 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dcm8l"] Feb 20 
08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.902195 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9hzgx"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.903571 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.908110 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-p4tqg" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.910553 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9hzgx"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916236 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vf9p\" (UniqueName: \"kubernetes.io/projected/1ab531ae-b53c-4de1-b927-ca32c159c244-kube-api-access-8vf9p\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/724c1050-e6d7-49c3-8b63-a89a3de26894-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916404 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7qzw\" (UniqueName: \"kubernetes.io/projected/724c1050-e6d7-49c3-8b63-a89a3de26894-kube-api-access-p7qzw\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " 
pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916457 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab531ae-b53c-4de1-b927-ca32c159c244-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.920374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/724c1050-e6d7-49c3-8b63-a89a3de26894-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.933385 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.949123 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7qzw\" (UniqueName: \"kubernetes.io/projected/724c1050-e6d7-49c3-8b63-a89a3de26894-kube-api-access-p7qzw\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.967368 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.019427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab531ae-b53c-4de1-b927-ca32c159c244-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.019541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vf9p\" (UniqueName: \"kubernetes.io/projected/1ab531ae-b53c-4de1-b927-ca32c159c244-kube-api-access-8vf9p\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.020750 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab531ae-b53c-4de1-b927-ca32c159c244-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.048649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vf9p\" (UniqueName: \"kubernetes.io/projected/1ab531ae-b53c-4de1-b927-ca32c159c244-kube-api-access-8vf9p\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.074196 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 
08:52:00.132313 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.250146 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.430468 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8"] Feb 20 08:52:00 crc kubenswrapper[5094]: W0220 08:52:00.456313 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9736b1_aca8_4880_9d94_2d7c37efce50.slice/crio-0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e WatchSource:0}: Error finding container 0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e: Status 404 returned error can't find the container with id 0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.794608 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85"] Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.860494 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf"] Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.954370 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dcm8l"] Feb 20 08:52:00 crc kubenswrapper[5094]: W0220 08:52:00.968452 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod724c1050_e6d7_49c3_8b63_a89a3de26894.slice/crio-5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c WatchSource:0}: 
Error finding container 5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c: Status 404 returned error can't find the container with id 5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.010504 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" event={"ID":"5a9736b1-aca8-4880-9d94-2d7c37efce50","Type":"ContainerStarted","Data":"0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.016085 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" event={"ID":"c70d95ea-5321-43fa-8df8-6d1138f0a732","Type":"ContainerStarted","Data":"054684d1c1fc5b91f1db72fb93c700a235c03254e2cd71a609d2b1a8cace63d7"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.024653 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9hzgx"] Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.026060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" event={"ID":"b0fb9831-f265-4976-9a1d-14ed3e08daf5","Type":"ContainerStarted","Data":"5b7fa806041218d240f6e687f2f2ae05330fa4bf1ad31b3f4d2f760ab8e97838"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.031409 5094 generic.go:334] "Generic (PLEG): container finished" podID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" exitCode=0 Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.031461 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" 
event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.031493 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerStarted","Data":"1df1d0ed97e3ac7409ed3dcedbc351adad4489c377b93acbbc87e23e667f53ac"} Feb 20 08:52:02 crc kubenswrapper[5094]: I0220 08:52:02.066830 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" event={"ID":"724c1050-e6d7-49c3-8b63-a89a3de26894","Type":"ContainerStarted","Data":"5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c"} Feb 20 08:52:02 crc kubenswrapper[5094]: I0220 08:52:02.075879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerStarted","Data":"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8"} Feb 20 08:52:02 crc kubenswrapper[5094]: I0220 08:52:02.085130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" event={"ID":"1ab531ae-b53c-4de1-b927-ca32c159c244","Type":"ContainerStarted","Data":"9d5eba9e678dc50f7e4465f743bd52d6d4a633dcd87ade03b99eb12c677c9d5f"} Feb 20 08:52:03 crc kubenswrapper[5094]: I0220 08:52:03.123260 5094 generic.go:334] "Generic (PLEG): container finished" podID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" exitCode=0 Feb 20 08:52:03 crc kubenswrapper[5094]: I0220 08:52:03.123315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" 
event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8"} Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.106344 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.106808 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.106848 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.107570 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.107622 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268" gracePeriod=600 Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.143952 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerStarted","Data":"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972"} Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.168982 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nx84l" podStartSLOduration=3.572531461 podStartE2EDuration="6.168963911s" podCreationTimestamp="2026-02-20 08:51:58 +0000 UTC" firstStartedPulling="2026-02-20 08:52:01.032858611 +0000 UTC m=+7535.905485322" lastFinishedPulling="2026-02-20 08:52:03.629291061 +0000 UTC m=+7538.501917772" observedRunningTime="2026-02-20 08:52:04.161659485 +0000 UTC m=+7539.034286196" watchObservedRunningTime="2026-02-20 08:52:04.168963911 +0000 UTC m=+7539.041590612" Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.037263 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-j7lxk"] Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.049875 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-j7lxk"] Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.268461 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268" exitCode=0 Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.268964 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268"} Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.269022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"} Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.269046 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.862032 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" path="/var/lib/kubelet/pods/bcbd09e1-8a1b-468e-9238-0691cafda43e/volumes" Feb 20 08:52:09 crc kubenswrapper[5094]: I0220 08:52:09.157021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:09 crc kubenswrapper[5094]: I0220 08:52:09.157472 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:10 crc kubenswrapper[5094]: I0220 08:52:10.204960 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nx84l" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" probeResult="failure" output=< Feb 20 08:52:10 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:52:10 crc kubenswrapper[5094]: > Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.439627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" event={"ID":"1ab531ae-b53c-4de1-b927-ca32c159c244","Type":"ContainerStarted","Data":"a6aeeebe3452765baf4fc4df8f2da74b6f5cfde3a173f15eb09905e67cdca1ea"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.440532 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.442810 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" event={"ID":"c70d95ea-5321-43fa-8df8-6d1138f0a732","Type":"ContainerStarted","Data":"d6ae85af8c0ad22bb7579b710f16c4e76e31031ca21f693d7ff02d09c3d3c194"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.445260 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" event={"ID":"724c1050-e6d7-49c3-8b63-a89a3de26894","Type":"ContainerStarted","Data":"e5816d6a41fbf36dfd4eecd83bdc52eb0b23e5586239f148b01393e04c99d76d"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.445535 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.448321 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.449057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" event={"ID":"b0fb9831-f265-4976-9a1d-14ed3e08daf5","Type":"ContainerStarted","Data":"16777c714efd6d0a3b065ff3b29685a6fc83e7a9d28335e6c63bcd1571c3c03e"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.452037 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" event={"ID":"5a9736b1-aca8-4880-9d94-2d7c37efce50","Type":"ContainerStarted","Data":"95b38ec28736aee248de3d23f584738f59fdb508c5b2fd50abe3103583bfa3f8"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.498925 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" podStartSLOduration=3.30291945 podStartE2EDuration="18.498908332s" podCreationTimestamp="2026-02-20 08:51:59 +0000 
UTC" firstStartedPulling="2026-02-20 08:52:01.026011358 +0000 UTC m=+7535.898638069" lastFinishedPulling="2026-02-20 08:52:16.22200024 +0000 UTC m=+7551.094626951" observedRunningTime="2026-02-20 08:52:17.493882331 +0000 UTC m=+7552.366509042" watchObservedRunningTime="2026-02-20 08:52:17.498908332 +0000 UTC m=+7552.371535043" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.530376 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" podStartSLOduration=3.379313426 podStartE2EDuration="18.530357218s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 08:52:01.018905966 +0000 UTC m=+7535.891532667" lastFinishedPulling="2026-02-20 08:52:16.169949748 +0000 UTC m=+7551.042576459" observedRunningTime="2026-02-20 08:52:17.52211539 +0000 UTC m=+7552.394742101" watchObservedRunningTime="2026-02-20 08:52:17.530357218 +0000 UTC m=+7552.402983929" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.548357 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" podStartSLOduration=3.195497476 podStartE2EDuration="18.548340161s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 08:52:00.802921041 +0000 UTC m=+7535.675547752" lastFinishedPulling="2026-02-20 08:52:16.155763726 +0000 UTC m=+7551.028390437" observedRunningTime="2026-02-20 08:52:17.545229496 +0000 UTC m=+7552.417856207" watchObservedRunningTime="2026-02-20 08:52:17.548340161 +0000 UTC m=+7552.420966872" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.645176 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" podStartSLOduration=2.933400952 podStartE2EDuration="18.645145069s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 
08:52:00.45902244 +0000 UTC m=+7535.331649151" lastFinishedPulling="2026-02-20 08:52:16.170766557 +0000 UTC m=+7551.043393268" observedRunningTime="2026-02-20 08:52:17.60859234 +0000 UTC m=+7552.481219051" watchObservedRunningTime="2026-02-20 08:52:17.645145069 +0000 UTC m=+7552.517771780" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.681425 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" podStartSLOduration=3.413205961 podStartE2EDuration="18.681401821s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 08:52:00.86485021 +0000 UTC m=+7535.737476911" lastFinishedPulling="2026-02-20 08:52:16.13304606 +0000 UTC m=+7551.005672771" observedRunningTime="2026-02-20 08:52:17.639462042 +0000 UTC m=+7552.512088753" watchObservedRunningTime="2026-02-20 08:52:17.681401821 +0000 UTC m=+7552.554028532" Feb 20 08:52:20 crc kubenswrapper[5094]: I0220 08:52:20.203436 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nx84l" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" probeResult="failure" output=< Feb 20 08:52:20 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:52:20 crc kubenswrapper[5094]: > Feb 20 08:52:29 crc kubenswrapper[5094]: I0220 08:52:29.226283 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:29 crc kubenswrapper[5094]: I0220 08:52:29.282082 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:30 crc kubenswrapper[5094]: I0220 08:52:30.034951 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:30 crc kubenswrapper[5094]: I0220 08:52:30.253637 5094 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:30 crc kubenswrapper[5094]: I0220 08:52:30.560452 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nx84l" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" containerID="cri-o://63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" gracePeriod=2 Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.229977 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.388634 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"0394b5a4-125e-479d-b699-d9bd69bf812f\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.388689 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"0394b5a4-125e-479d-b699-d9bd69bf812f\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.388740 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"0394b5a4-125e-479d-b699-d9bd69bf812f\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.389339 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities" 
(OuterVolumeSpecName: "utilities") pod "0394b5a4-125e-479d-b699-d9bd69bf812f" (UID: "0394b5a4-125e-479d-b699-d9bd69bf812f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.395435 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt" (OuterVolumeSpecName: "kube-api-access-fxwtt") pod "0394b5a4-125e-479d-b699-d9bd69bf812f" (UID: "0394b5a4-125e-479d-b699-d9bd69bf812f"). InnerVolumeSpecName "kube-api-access-fxwtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.439225 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0394b5a4-125e-479d-b699-d9bd69bf812f" (UID: "0394b5a4-125e-479d-b699-d9bd69bf812f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.491860 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.491902 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.491916 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569571 5094 generic.go:334] "Generic (PLEG): container finished" podID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" exitCode=0 Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972"} Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569645 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569669 5094 scope.go:117] "RemoveContainer" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569657 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"1df1d0ed97e3ac7409ed3dcedbc351adad4489c377b93acbbc87e23e667f53ac"} Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.615966 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.619143 5094 scope.go:117] "RemoveContainer" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.630858 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.652881 5094 scope.go:117] "RemoveContainer" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.704456 5094 scope.go:117] "RemoveContainer" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" Feb 20 08:52:31 crc kubenswrapper[5094]: E0220 08:52:31.705040 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972\": container with ID starting with 63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972 not found: ID does not exist" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705089 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972"} err="failed to get container status \"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972\": rpc error: code = NotFound desc = could not find container \"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972\": container with ID starting with 63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972 not found: ID does not exist" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705110 5094 scope.go:117] "RemoveContainer" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" Feb 20 08:52:31 crc kubenswrapper[5094]: E0220 08:52:31.705526 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8\": container with ID starting with af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8 not found: ID does not exist" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705567 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8"} err="failed to get container status \"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8\": rpc error: code = NotFound desc = could not find container \"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8\": container with ID starting with af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8 not found: ID does not exist" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705591 5094 scope.go:117] "RemoveContainer" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" Feb 20 08:52:31 crc kubenswrapper[5094]: E0220 
08:52:31.705894 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2\": container with ID starting with 7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2 not found: ID does not exist" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705940 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2"} err="failed to get container status \"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2\": rpc error: code = NotFound desc = could not find container \"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2\": container with ID starting with 7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2 not found: ID does not exist" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.854973 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" path="/var/lib/kubelet/pods/0394b5a4-125e-479d-b699-d9bd69bf812f/volumes" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.864481 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.865836 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" containerID="cri-o://cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" gracePeriod=2 Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.872660 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.941689 5094 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.942478 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.942552 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.942633 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-content" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.942689 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-content" Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.942778 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.942953 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.943020 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-utilities" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.943078 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-utilities" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.943312 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.943397 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.944071 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.963363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.978949 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" podUID="3c21f8d0-ca22-4206-9cdf-26edee70eac2" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.120947 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.120993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.121118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pncd7\" (UniqueName: \"kubernetes.io/projected/3c21f8d0-ca22-4206-9cdf-26edee70eac2-kube-api-access-pncd7\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.124544 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/kube-state-metrics-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.125838 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.137014 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ck746" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.141995 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.224022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.224072 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.224169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pncd7\" (UniqueName: \"kubernetes.io/projected/3c21f8d0-ca22-4206-9cdf-26edee70eac2-kube-api-access-pncd7\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.228356 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") 
" pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.250427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.271134 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pncd7\" (UniqueName: \"kubernetes.io/projected/3c21f8d0-ca22-4206-9cdf-26edee70eac2-kube-api-access-pncd7\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.326168 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrt96\" (UniqueName: \"kubernetes.io/projected/640e24e6-f89c-45ee-999a-e5aa0816aab2-kube-api-access-hrt96\") pod \"kube-state-metrics-0\" (UID: \"640e24e6-f89c-45ee-999a-e5aa0816aab2\") " pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.428084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrt96\" (UniqueName: \"kubernetes.io/projected/640e24e6-f89c-45ee-999a-e5aa0816aab2-kube-api-access-hrt96\") pod \"kube-state-metrics-0\" (UID: \"640e24e6-f89c-45ee-999a-e5aa0816aab2\") " pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.451620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrt96\" (UniqueName: \"kubernetes.io/projected/640e24e6-f89c-45ee-999a-e5aa0816aab2-kube-api-access-hrt96\") pod \"kube-state-metrics-0\" (UID: \"640e24e6-f89c-45ee-999a-e5aa0816aab2\") " pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 
08:52:33.562586 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.749753 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.868865 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.872451 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.875682 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876203 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-wx2qf" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876391 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876626 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876416 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.885450 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968496 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968732 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968814 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968916 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbk4\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-kube-api-access-bkbk4\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.969001 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.070956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071037 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071099 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071125 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071165 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071202 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbk4\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-kube-api-access-bkbk4\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071224 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.072029 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.081512 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.082911 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.090240 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.090675 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.091415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.106719 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbk4\" (UniqueName: 
\"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-kube-api-access-bkbk4\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.245793 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.371501 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.488409 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.492137 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522172 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522684 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522795 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522833 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522221 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522861 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 20 
08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522984 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.523009 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-65nkr" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.560901 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.604780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.674402 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"640e24e6-f89c-45ee-999a-e5aa0816aab2","Type":"ContainerStarted","Data":"1608dc353a7cc7eecd85ae0f251fb9d8d8e618c90e96fbbc4fdc692c3ab5e942"} Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.683382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c21f8d0-ca22-4206-9cdf-26edee70eac2","Type":"ContainerStarted","Data":"7efd800e3876958580b04f199acc7c1bd9fc79868ef56b29f76534fff745b5a0"} Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689664 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfmlj\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-kube-api-access-xfmlj\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689723 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689804 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689839 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689859 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689908 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22516d8a-bb80-405e-8258-01fd733495ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689946 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689973 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.690000 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792525 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc 
kubenswrapper[5094]: I0220 08:52:34.792589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792664 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfmlj\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-kube-api-access-xfmlj\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " 
pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22516d8a-bb80-405e-8258-01fd733495ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.796823 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" 
(UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.797435 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.798727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.804117 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22516d8a-bb80-405e-8258-01fd733495ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.804939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.809491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.823366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.832210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.851647 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfmlj\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-kube-api-access-xfmlj\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.861329 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.861387 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2a5e5c320d42bf051c2648299b41e130990427635def81e3e854b40dad0c11aa/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.010368 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.022842 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.152921 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.486823 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.610925 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"b4afe958-0e78-49e9-b05a-08ff4c42f602\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.611104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"b4afe958-0e78-49e9-b05a-08ff4c42f602\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.611248 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"b4afe958-0e78-49e9-b05a-08ff4c42f602\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.631370 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c" (OuterVolumeSpecName: "kube-api-access-kd86c") pod "b4afe958-0e78-49e9-b05a-08ff4c42f602" (UID: "b4afe958-0e78-49e9-b05a-08ff4c42f602"). InnerVolumeSpecName "kube-api-access-kd86c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.637262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b4afe958-0e78-49e9-b05a-08ff4c42f602" (UID: "b4afe958-0e78-49e9-b05a-08ff4c42f602"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.713574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b4afe958-0e78-49e9-b05a-08ff4c42f602" (UID: "b4afe958-0e78-49e9-b05a-08ff4c42f602"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.714993 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.715029 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.715041 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.718885 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"640e24e6-f89c-45ee-999a-e5aa0816aab2","Type":"ContainerStarted","Data":"fdad93376df51125a09e0b7e4c9aea575f5d0c65e55b1bcc0aad25c449686491"} Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.719559 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.725264 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Feb 20 08:52:35 crc kubenswrapper[5094]: W0220 08:52:35.730163 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22516d8a_bb80_405e_8258_01fd733495ef.slice/crio-e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692 WatchSource:0}: Error finding container e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692: Status 404 returned error can't find the container with id e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692 Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.747507 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.224774564 podStartE2EDuration="2.747492025s" podCreationTimestamp="2026-02-20 08:52:33 +0000 UTC" firstStartedPulling="2026-02-20 08:52:34.62303149 +0000 UTC m=+7569.495658201" lastFinishedPulling="2026-02-20 08:52:35.145748951 +0000 UTC m=+7570.018375662" observedRunningTime="2026-02-20 08:52:35.746554112 +0000 UTC m=+7570.619180823" watchObservedRunningTime="2026-02-20 08:52:35.747492025 +0000 UTC m=+7570.620118736" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.754651 5094 generic.go:334] "Generic (PLEG): container finished" podID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" exitCode=137 Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.754876 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.755655 5094 scope.go:117] "RemoveContainer" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.760049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"5d16b227891bdbf1374415e0506890350e5bcb7672f0639b3366d1fd133d340c"} Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.763428 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c21f8d0-ca22-4206-9cdf-26edee70eac2","Type":"ContainerStarted","Data":"27267b5502fecc6a43056632683dfee580dedecd9796c21b815075674efcc79e"} Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.787895 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" podUID="3c21f8d0-ca22-4206-9cdf-26edee70eac2" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.796937 5094 scope.go:117] "RemoveContainer" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" Feb 20 08:52:35 crc kubenswrapper[5094]: E0220 08:52:35.797888 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181\": container with ID starting with cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181 not found: ID does not exist" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.797946 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181"} err="failed to get container status \"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181\": rpc error: code = NotFound desc = could not find container \"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181\": container with ID starting with cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181 not found: ID does not exist" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.800313 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.800292575 podStartE2EDuration="3.800292575s" podCreationTimestamp="2026-02-20 08:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:52:35.781865412 +0000 UTC m=+7570.654492123" watchObservedRunningTime="2026-02-20 08:52:35.800292575 +0000 UTC m=+7570.672919286" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.853285 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" path="/var/lib/kubelet/pods/b4afe958-0e78-49e9-b05a-08ff4c42f602/volumes" Feb 20 08:52:36 crc kubenswrapper[5094]: I0220 08:52:36.782052 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692"} Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.838850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"d747fb06a90682657b4ae142d3d94a412957a924fc193ade70b70b69eda6b31a"} Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.842551 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"db342d9226fdf538f0f218b110f06cb19147593032abe4184c062b1e1763c716"} Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.897350 5094 scope.go:117] "RemoveContainer" containerID="5af35aa0d974ec2be3d578b66402a33233be4efbd611deaf5976f2b6d54c4e72" Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.943884 5094 scope.go:117] "RemoveContainer" containerID="710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660" Feb 20 08:52:41 crc kubenswrapper[5094]: I0220 08:52:41.140873 5094 scope.go:117] "RemoveContainer" containerID="4495a0b785b56a81800453fd2516a41bac0676f202c2358f07c81e7849110742" Feb 20 08:52:43 crc kubenswrapper[5094]: I0220 08:52:43.765149 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.055937 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.068732 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.851453 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" path="/var/lib/kubelet/pods/5d39890b-bbcb-4fcb-9f5e-6f74782fc661/volumes" Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.918226 5094 generic.go:334] "Generic (PLEG): container finished" podID="22516d8a-bb80-405e-8258-01fd733495ef" containerID="d747fb06a90682657b4ae142d3d94a412957a924fc193ade70b70b69eda6b31a" exitCode=0 Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.918310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerDied","Data":"d747fb06a90682657b4ae142d3d94a412957a924fc193ade70b70b69eda6b31a"} Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.921155 5094 generic.go:334] "Generic (PLEG): container finished" podID="c09fbf6b-1221-4e3d-b29d-6432848a564b" containerID="db342d9226fdf538f0f218b110f06cb19147593032abe4184c062b1e1763c716" exitCode=0 Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.921187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerDied","Data":"db342d9226fdf538f0f218b110f06cb19147593032abe4184c062b1e1763c716"} Feb 20 08:52:48 crc kubenswrapper[5094]: I0220 08:52:48.039622 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:52:48 crc kubenswrapper[5094]: I0220 08:52:48.049494 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:52:49 crc kubenswrapper[5094]: I0220 08:52:49.855506 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" path="/var/lib/kubelet/pods/f0fbd49a-25e7-44de-a81d-f324feba0dff/volumes" Feb 20 08:52:50 crc kubenswrapper[5094]: I0220 08:52:50.953805 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"41f30873b500df1cc75c82c266570b32e40f31aa3508f4f529cfba349449edb0"} Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 08:52:53.985785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"962cb6f946961a50a912015b26a5b50b4fca505d54ba58164b9ff45aa52116d8"} Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 
08:52:53.988411 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"5abfa5fefa6943b0f00687b18c9cec4287624f61d4fa8fe28ac28fa89cc8b86a"} Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 08:52:53.989262 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 08:52:53.991575 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:54 crc kubenswrapper[5094]: I0220 08:52:54.028889 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.820937738 podStartE2EDuration="21.028865546s" podCreationTimestamp="2026-02-20 08:52:33 +0000 UTC" firstStartedPulling="2026-02-20 08:52:35.03131773 +0000 UTC m=+7569.903944441" lastFinishedPulling="2026-02-20 08:52:50.239245548 +0000 UTC m=+7585.111872249" observedRunningTime="2026-02-20 08:52:54.021098619 +0000 UTC m=+7588.893725370" watchObservedRunningTime="2026-02-20 08:52:54.028865546 +0000 UTC m=+7588.901492257" Feb 20 08:52:58 crc kubenswrapper[5094]: I0220 08:52:58.037228 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"91e1b6af9e2520978b5d06e517655629b996d9d120f47e987ff5902a730f1390"} Feb 20 08:53:02 crc kubenswrapper[5094]: I0220 08:53:02.085904 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"7e2aefc274a4b17af715cf73d71ebe7aa89df2a703b9bb2948244f3ccf5dd608"} Feb 20 08:53:02 crc kubenswrapper[5094]: I0220 08:53:02.130513 5094 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.374768993 podStartE2EDuration="29.130483886s" podCreationTimestamp="2026-02-20 08:52:33 +0000 UTC" firstStartedPulling="2026-02-20 08:52:35.758247944 +0000 UTC m=+7570.630874655" lastFinishedPulling="2026-02-20 08:53:01.513962837 +0000 UTC m=+7596.386589548" observedRunningTime="2026-02-20 08:53:02.111861158 +0000 UTC m=+7596.984487909" watchObservedRunningTime="2026-02-20 08:53:02.130483886 +0000 UTC m=+7597.003110627" Feb 20 08:53:05 crc kubenswrapper[5094]: I0220 08:53:05.154610 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:05 crc kubenswrapper[5094]: I0220 08:53:05.155313 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:05 crc kubenswrapper[5094]: I0220 08:53:05.159215 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:06 crc kubenswrapper[5094]: I0220 08:53:06.124356 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.363884 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.367197 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.369671 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.369991 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.377245 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481717 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481786 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481805 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481825 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " 
pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481879 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481968 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.482224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584734 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584756 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584818 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.585050 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.585750 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " 
pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.586409 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.591132 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.591618 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.592273 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.592580 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.602088 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb4wq\" (UniqueName: 
\"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.690181 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:09 crc kubenswrapper[5094]: I0220 08:53:09.336290 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:10 crc kubenswrapper[5094]: I0220 08:53:10.161753 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"8d3046356c6ab6376d54ed64d8a87d4b911f9de1922da02bb0c47ec017f24b8c"} Feb 20 08:53:13 crc kubenswrapper[5094]: I0220 08:53:13.192421 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6"} Feb 20 08:53:14 crc kubenswrapper[5094]: I0220 08:53:14.205253 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87"} Feb 20 08:53:15 crc kubenswrapper[5094]: I0220 08:53:15.222445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b"} Feb 20 08:53:17 crc kubenswrapper[5094]: I0220 08:53:17.246697 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2"} 
Feb 20 08:53:17 crc kubenswrapper[5094]: I0220 08:53:17.248203 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 08:53:17 crc kubenswrapper[5094]: I0220 08:53:17.282408 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4558336069999998 podStartE2EDuration="9.282391067s" podCreationTimestamp="2026-02-20 08:53:08 +0000 UTC" firstStartedPulling="2026-02-20 08:53:09.331508154 +0000 UTC m=+7604.204134865" lastFinishedPulling="2026-02-20 08:53:16.158065604 +0000 UTC m=+7611.030692325" observedRunningTime="2026-02-20 08:53:17.268658356 +0000 UTC m=+7612.141285067" watchObservedRunningTime="2026-02-20 08:53:17.282391067 +0000 UTC m=+7612.155017768" Feb 20 08:53:20 crc kubenswrapper[5094]: I0220 08:53:20.049971 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2fzgg"] Feb 20 08:53:20 crc kubenswrapper[5094]: I0220 08:53:20.060745 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2fzgg"] Feb 20 08:53:21 crc kubenswrapper[5094]: I0220 08:53:21.858985 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" path="/var/lib/kubelet/pods/b382ec69-4b87-43f5-b964-eba4282bcc42/volumes" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.176937 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.178832 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.191718 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.239496 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.239605 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.307574 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.309184 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.311787 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.317823 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.340869 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.340923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.340962 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.341002 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"aodh-db-create-drwqv\" (UID: 
\"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.341746 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.363588 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.442930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.442991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.443609 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: 
\"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.471548 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.573461 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.629857 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.064471 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.182332 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 08:53:26 crc kubenswrapper[5094]: W0220 08:53:26.185105 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod537981f5_8e74_406f_9199_8bac8aa60903.slice/crio-1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695 WatchSource:0}: Error finding container 1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695: Status 404 returned error can't find the container with id 1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695 Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.190810 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.352266 5094 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/aodh-4eea-account-create-update-rqsxn" event={"ID":"537981f5-8e74-406f-9199-8bac8aa60903","Type":"ContainerStarted","Data":"1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695"} Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.354044 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerStarted","Data":"7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6"} Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.354068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerStarted","Data":"94e8abe12427ae34e22b9ee3dcb41fee4c0f9b52a7ed17799a36e248a73e58e5"} Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.369302 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-drwqv" podStartSLOduration=1.369284483 podStartE2EDuration="1.369284483s" podCreationTimestamp="2026-02-20 08:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:53:26.368155236 +0000 UTC m=+7621.240781947" watchObservedRunningTime="2026-02-20 08:53:26.369284483 +0000 UTC m=+7621.241911194" Feb 20 08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.364168 5094 generic.go:334] "Generic (PLEG): container finished" podID="537981f5-8e74-406f-9199-8bac8aa60903" containerID="d3221fcda11fc25108efa9fb80c6774c8d350491f8d20f83e1f5fae473f8e306" exitCode=0 Feb 20 08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.364210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4eea-account-create-update-rqsxn" event={"ID":"537981f5-8e74-406f-9199-8bac8aa60903","Type":"ContainerDied","Data":"d3221fcda11fc25108efa9fb80c6774c8d350491f8d20f83e1f5fae473f8e306"} Feb 20 
08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.366460 5094 generic.go:334] "Generic (PLEG): container finished" podID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerID="7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6" exitCode=0 Feb 20 08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.366493 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerDied","Data":"7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6"} Feb 20 08:53:28 crc kubenswrapper[5094]: I0220 08:53:28.892966 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:28 crc kubenswrapper[5094]: I0220 08:53:28.897944 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012750 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"ae290c11-18c8-4d9a-90d3-8f2219084a78\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012801 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"537981f5-8e74-406f-9199-8bac8aa60903\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012843 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"ae290c11-18c8-4d9a-90d3-8f2219084a78\" (UID: 
\"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"537981f5-8e74-406f-9199-8bac8aa60903\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.013699 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "537981f5-8e74-406f-9199-8bac8aa60903" (UID: "537981f5-8e74-406f-9199-8bac8aa60903"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.014404 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae290c11-18c8-4d9a-90d3-8f2219084a78" (UID: "ae290c11-18c8-4d9a-90d3-8f2219084a78"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.018936 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967" (OuterVolumeSpecName: "kube-api-access-hl967") pod "ae290c11-18c8-4d9a-90d3-8f2219084a78" (UID: "ae290c11-18c8-4d9a-90d3-8f2219084a78"). InnerVolumeSpecName "kube-api-access-hl967". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.019594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz" (OuterVolumeSpecName: "kube-api-access-gbtzz") pod "537981f5-8e74-406f-9199-8bac8aa60903" (UID: "537981f5-8e74-406f-9199-8bac8aa60903"). InnerVolumeSpecName "kube-api-access-gbtzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.115634 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.116013 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.116034 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.116053 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.383785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4eea-account-create-update-rqsxn" event={"ID":"537981f5-8e74-406f-9199-8bac8aa60903","Type":"ContainerDied","Data":"1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695"} Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 
08:53:29.383832 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.383836 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.387116 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerDied","Data":"94e8abe12427ae34e22b9ee3dcb41fee4c0f9b52a7ed17799a36e248a73e58e5"} Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.387163 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e8abe12427ae34e22b9ee3dcb41fee4c0f9b52a7ed17799a36e248a73e58e5" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.387274 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.634064 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 08:53:30 crc kubenswrapper[5094]: E0220 08:53:30.634838 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537981f5-8e74-406f-9199-8bac8aa60903" containerName="mariadb-account-create-update" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.634852 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="537981f5-8e74-406f-9199-8bac8aa60903" containerName="mariadb-account-create-update" Feb 20 08:53:30 crc kubenswrapper[5094]: E0220 08:53:30.634872 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerName="mariadb-database-create" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.634878 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerName="mariadb-database-create" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.635079 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="537981f5-8e74-406f-9199-8bac8aa60903" containerName="mariadb-account-create-update" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.635095 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerName="mariadb-database-create" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.635891 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.641170 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.644101 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.644851 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.646578 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.646845 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-zl6jq" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766170 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766615 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " 
pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.868319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.868457 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.868496 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.869823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.876640 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.878527 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.883604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.885207 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.951498 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:31 crc kubenswrapper[5094]: W0220 08:53:31.610872 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2950b502_5079_4a08_8aaf_f0b5d376a3f2.slice/crio-0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15 WatchSource:0}: Error finding container 0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15: Status 404 returned error can't find the container with id 0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15 Feb 20 08:53:31 crc kubenswrapper[5094]: I0220 08:53:31.614017 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:53:31 crc kubenswrapper[5094]: I0220 08:53:31.627363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 08:53:32 crc kubenswrapper[5094]: I0220 08:53:32.423831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerStarted","Data":"0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15"} Feb 20 08:53:37 crc kubenswrapper[5094]: I0220 08:53:37.467155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerStarted","Data":"b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7"} Feb 20 08:53:37 crc kubenswrapper[5094]: I0220 08:53:37.493604 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-bg4qh" podStartSLOduration=2.411351447 podStartE2EDuration="7.493581694s" podCreationTimestamp="2026-02-20 08:53:30 +0000 UTC" firstStartedPulling="2026-02-20 08:53:31.613564868 +0000 UTC m=+7626.486191619" lastFinishedPulling="2026-02-20 08:53:36.695795155 +0000 UTC m=+7631.568421866" 
observedRunningTime="2026-02-20 08:53:37.480576461 +0000 UTC m=+7632.353203182" watchObservedRunningTime="2026-02-20 08:53:37.493581694 +0000 UTC m=+7632.366208415" Feb 20 08:53:38 crc kubenswrapper[5094]: I0220 08:53:38.703465 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 08:53:40 crc kubenswrapper[5094]: I0220 08:53:40.493991 5094 generic.go:334] "Generic (PLEG): container finished" podID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerID="b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7" exitCode=0 Feb 20 08:53:40 crc kubenswrapper[5094]: I0220 08:53:40.494059 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerDied","Data":"b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7"} Feb 20 08:53:41 crc kubenswrapper[5094]: I0220 08:53:41.410895 5094 scope.go:117] "RemoveContainer" containerID="26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a" Feb 20 08:53:41 crc kubenswrapper[5094]: I0220 08:53:41.435799 5094 scope.go:117] "RemoveContainer" containerID="0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a" Feb 20 08:53:41 crc kubenswrapper[5094]: I0220 08:53:41.489401 5094 scope.go:117] "RemoveContainer" containerID="d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.018818 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.160932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.160987 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.161126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.161144 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.166077 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts" (OuterVolumeSpecName: "scripts") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.166367 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz" (OuterVolumeSpecName: "kube-api-access-l9rjz") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "kube-api-access-l9rjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.188661 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.190058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data" (OuterVolumeSpecName: "config-data") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263385 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263415 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263426 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263437 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.526539 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerDied","Data":"0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15"} Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.526611 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.526797 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.240746 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 20 08:53:45 crc kubenswrapper[5094]: E0220 08:53:45.243150 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerName="aodh-db-sync" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.243178 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerName="aodh-db-sync" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.243406 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerName="aodh-db-sync" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.246836 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.251188 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.251200 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-zl6jq" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.251400 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.260285 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.334477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjvq\" (UniqueName: \"kubernetes.io/projected/4f38802d-49cb-413c-ac61-665d5c77a1a3-kube-api-access-wpjvq\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc 
kubenswrapper[5094]: I0220 08:53:45.334581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-scripts\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.334666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-config-data\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.334763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436247 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436376 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjvq\" (UniqueName: \"kubernetes.io/projected/4f38802d-49cb-413c-ac61-665d5c77a1a3-kube-api-access-wpjvq\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-scripts\") 
pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-config-data\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.444558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.448452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-config-data\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.454045 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-scripts\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.455740 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjvq\" (UniqueName: \"kubernetes.io/projected/4f38802d-49cb-413c-ac61-665d5c77a1a3-kube-api-access-wpjvq\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.573883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 20 08:53:46 crc kubenswrapper[5094]: I0220 08:53:46.089283 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 20 08:53:46 crc kubenswrapper[5094]: I0220 08:53:46.561524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"730b6ecf3ba7d016cc450efeb07b413166b9838f947281daf1abfc31fde01ce8"} Feb 20 08:53:46 crc kubenswrapper[5094]: I0220 08:53:46.561901 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"3237a18f11e311c42d21d546d8c4ea50d1f8ff2f86a649e7d2c75e24ae2594ef"} Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498251 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498877 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent" containerID="cri-o://65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6" gracePeriod=30 Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498902 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd" containerID="cri-o://ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2" gracePeriod=30 Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498961 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent" containerID="cri-o://2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87" gracePeriod=30 Feb 20 08:53:47 
crc kubenswrapper[5094]: I0220 08:53:47.498952 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core" containerID="cri-o://36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b" gracePeriod=30 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.599918 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2" exitCode=0 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601117 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2"} Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601118 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b" exitCode=2 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601150 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6" exitCode=0 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b"} Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601380 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6"} Feb 20 08:53:48 crc 
kubenswrapper[5094]: I0220 08:53:48.610087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"9ac908b0382fd3ade38084ba83187b903fd00f472508cca86774b10ee2f78f8c"} Feb 20 08:53:49 crc kubenswrapper[5094]: I0220 08:53:49.620963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"a0a9eae3512bc254e98755e973d0349cd0928b410aed8d756eb5fb675b1e04f4"} Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.041437 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.053901 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.634750 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87" exitCode=0 Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.634810 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87"} Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.641238 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"d880101af76eba9ae1c04df323918f531d2f04c0e84bcf88b07c2a52d7a1c082"} Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.662321 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.945613372 podStartE2EDuration="5.662295096s" 
podCreationTimestamp="2026-02-20 08:53:45 +0000 UTC" firstStartedPulling="2026-02-20 08:53:46.096546231 +0000 UTC m=+7640.969172942" lastFinishedPulling="2026-02-20 08:53:49.813227955 +0000 UTC m=+7644.685854666" observedRunningTime="2026-02-20 08:53:50.659179591 +0000 UTC m=+7645.531806302" watchObservedRunningTime="2026-02-20 08:53:50.662295096 +0000 UTC m=+7645.534921807" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.740571 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836810 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836875 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836930 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc 
kubenswrapper[5094]: I0220 08:53:50.836957 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") "
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.837000 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") "
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.837027 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") "
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.840306 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.843634 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.846370 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq" (OuterVolumeSpecName: "kube-api-access-wb4wq") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "kube-api-access-wb4wq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.847304 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts" (OuterVolumeSpecName: "scripts") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.892354 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.915753 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.939983 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940030 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") on node \"crc\" DevicePath \"\""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940044 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940057 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940067 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940076 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.986842 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data" (OuterVolumeSpecName: "config-data") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.033314 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-r5qjd"]
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.042390 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.045579 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-r5qjd"]
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.656808 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.656821 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"8d3046356c6ab6376d54ed64d8a87d4b911f9de1922da02bb0c47ec017f24b8c"}
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.657489 5094 scope.go:117] "RemoveContainer" containerID="ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.679502 5094 scope.go:117] "RemoveContainer" containerID="36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.694853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.707873 5094 scope.go:117] "RemoveContainer" containerID="2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.709029 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.729763 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730191 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730213 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd"
Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730239 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730246 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core"
Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730262 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730270 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent"
Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730279 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730494 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730510 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730519 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730532 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.732283 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.748340 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.765623 5094 scope.go:117] "RemoveContainer" containerID="65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.767118 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.767374 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.767246 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.768838 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769015 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769287 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769462 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.855969 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" path="/var/lib/kubelet/pods/0e67bd4c-454a-4166-9e28-49c348795b29/volumes"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.856729 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" path="/var/lib/kubelet/pods/3d5a7349-432b-4431-bbf3-5079cbad3819/volumes"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.857542 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" path="/var/lib/kubelet/pods/d5bcef59-b989-4157-8233-6482f9f3abab/volumes"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870781 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870812 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870905 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870967 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.873922 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.874390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.897032 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.897266 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.900283 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.901160 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.913325 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0"
Feb 20 08:53:52 crc kubenswrapper[5094]: I0220 08:53:52.098664 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 08:53:52 crc kubenswrapper[5094]: I0220 08:53:52.583390 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 08:53:52 crc kubenswrapper[5094]: W0220 08:53:52.583394 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4faa0d21_cfca_4eae_a05a_3ec287395c30.slice/crio-4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8 WatchSource:0}: Error finding container 4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8: Status 404 returned error can't find the container with id 4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8
Feb 20 08:53:52 crc kubenswrapper[5094]: I0220 08:53:52.671038 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8"}
Feb 20 08:53:53 crc kubenswrapper[5094]: I0220 08:53:53.684876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51"}
Feb 20 08:53:53 crc kubenswrapper[5094]: I0220 08:53:53.685199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1"}
Feb 20 08:53:54 crc kubenswrapper[5094]: I0220 08:53:54.695395 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164"}
Feb 20 08:53:55 crc kubenswrapper[5094]: I0220 08:53:55.711547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec"}
Feb 20 08:53:55 crc kubenswrapper[5094]: I0220 08:53:55.712214 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 20 08:53:55 crc kubenswrapper[5094]: I0220 08:53:55.744334 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.017267019 podStartE2EDuration="4.744095363s" podCreationTimestamp="2026-02-20 08:53:51 +0000 UTC" firstStartedPulling="2026-02-20 08:53:52.585519254 +0000 UTC m=+7647.458145965" lastFinishedPulling="2026-02-20 08:53:55.312347598 +0000 UTC m=+7650.184974309" observedRunningTime="2026-02-20 08:53:55.737660498 +0000 UTC m=+7650.610287209" watchObservedRunningTime="2026-02-20 08:53:55.744095363 +0000 UTC m=+7650.616722074"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.115999 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-mtnn8"]
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.117545 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.135772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-mtnn8"]
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.262610 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.262748 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.325414 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"]
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.326921 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.329791 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.335214 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"]
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.364640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.364784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.365571 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.388538 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.437126 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mtnn8"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.466248 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.466782 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.569382 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.569462 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.570923 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.589632 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.646473 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.965801 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-mtnn8"]
Feb 20 08:53:57 crc kubenswrapper[5094]: W0220 08:53:57.974478 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cc3f76_dfd1_4d7c_8adb_08cbc55636a3.slice/crio-4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40 WatchSource:0}: Error finding container 4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40: Status 404 returned error can't find the container with id 4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40
Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.168534 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"]
Feb 20 08:53:58 crc kubenswrapper[5094]: W0220 08:53:58.173176 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bad0291_15d9_4dc5_acd6_26bc8d8aad76.slice/crio-cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d WatchSource:0}: Error finding container cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d: Status 404 returned error can't find the container with id cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d
Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.748328 5094 generic.go:334] "Generic (PLEG): container finished" podID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerID="fcec52d45c535185ac325065c4cab11c829c4a1ebad6b2123939c3a35f4b9360" exitCode=0
Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.748409 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-560b-account-create-update-6s8n2" event={"ID":"6bad0291-15d9-4dc5-acd6-26bc8d8aad76","Type":"ContainerDied","Data":"fcec52d45c535185ac325065c4cab11c829c4a1ebad6b2123939c3a35f4b9360"}
Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.748734 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-560b-account-create-update-6s8n2" event={"ID":"6bad0291-15d9-4dc5-acd6-26bc8d8aad76","Type":"ContainerStarted","Data":"cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d"}
Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.749923 5094 generic.go:334] "Generic (PLEG): container finished" podID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerID="40e80a7f49d2a8cd8ede69f04221413d74bc3298b5502921432bc86e342a4f7d" exitCode=0
Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.749959 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mtnn8" event={"ID":"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3","Type":"ContainerDied","Data":"40e80a7f49d2a8cd8ede69f04221413d74bc3298b5502921432bc86e342a4f7d"}
Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.749974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mtnn8" event={"ID":"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3","Type":"ContainerStarted","Data":"4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40"}
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.067143 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-smd54"]
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.128738 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-smd54"]
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.446754 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mtnn8"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.451288 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.536578 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") "
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.536670 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") "
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.536746 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") "
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.536811 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") "
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.537173 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" (UID: "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.537236 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bad0291-15d9-4dc5-acd6-26bc8d8aad76" (UID: "6bad0291-15d9-4dc5-acd6-26bc8d8aad76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.545979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm" (OuterVolumeSpecName: "kube-api-access-6cxwm") pod "6bad0291-15d9-4dc5-acd6-26bc8d8aad76" (UID: "6bad0291-15d9-4dc5-acd6-26bc8d8aad76"). InnerVolumeSpecName "kube-api-access-6cxwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.550285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd" (OuterVolumeSpecName: "kube-api-access-wnfgd") pod "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" (UID: "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3"). InnerVolumeSpecName "kube-api-access-wnfgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641237 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641282 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641295 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641308 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.779079 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-560b-account-create-update-6s8n2" event={"ID":"6bad0291-15d9-4dc5-acd6-26bc8d8aad76","Type":"ContainerDied","Data":"cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d"}
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.779127 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.779186 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.783145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mtnn8" event={"ID":"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3","Type":"ContainerDied","Data":"4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40"}
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.783194 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.783436 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mtnn8"
Feb 20 08:54:01 crc kubenswrapper[5094]: I0220 08:54:01.853016 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" path="/var/lib/kubelet/pods/b63f3e88-3e2a-43db-88de-8cf778187671/volumes"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.774407 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-87lwb"]
Feb 20 08:54:02 crc kubenswrapper[5094]: E0220 08:54:02.775503 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerName="mariadb-database-create"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.775609 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerName="mariadb-database-create"
Feb 20 08:54:02 crc kubenswrapper[5094]: E0220 08:54:02.775753 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerName="mariadb-account-create-update"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.775845 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerName="mariadb-account-create-update"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.776169 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerName="mariadb-account-create-update"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.776284 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerName="mariadb-database-create"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.777309 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.781478 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zwzzl" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.783001 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.789684 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-87lwb"] Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892179 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892414 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892510 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892539 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: 
\"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.993885 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.994205 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.994945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.995045 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.002094 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"manila-db-sync-87lwb\" (UID: 
\"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.002633 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.003813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.021870 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.117476 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.947978 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-87lwb"] Feb 20 08:54:04 crc kubenswrapper[5094]: I0220 08:54:04.106883 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:54:04 crc kubenswrapper[5094]: I0220 08:54:04.106936 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:54:04 crc kubenswrapper[5094]: I0220 08:54:04.822139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerStarted","Data":"ab10d89216ed579b6386deb45a07ae8b41f6f27357e9c40b4d0986f26b63851f"} Feb 20 08:54:10 crc kubenswrapper[5094]: I0220 08:54:10.881163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerStarted","Data":"8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825"} Feb 20 08:54:10 crc kubenswrapper[5094]: I0220 08:54:10.898671 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-87lwb" podStartSLOduration=3.115585194 podStartE2EDuration="8.898653328s" podCreationTimestamp="2026-02-20 08:54:02 +0000 UTC" firstStartedPulling="2026-02-20 08:54:03.964260663 +0000 UTC m=+7658.836887374" lastFinishedPulling="2026-02-20 
08:54:09.747328787 +0000 UTC m=+7664.619955508" observedRunningTime="2026-02-20 08:54:10.894669213 +0000 UTC m=+7665.767295924" watchObservedRunningTime="2026-02-20 08:54:10.898653328 +0000 UTC m=+7665.771280039" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.778043 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrlvp"] Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.781880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.796601 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"] Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.880126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.880522 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.880725 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" 
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.982789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.982932 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.983117 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.983350 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.983432 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:12 crc kubenswrapper[5094]: 
I0220 08:54:12.013752 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.149100 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.599752 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"] Feb 20 08:54:12 crc kubenswrapper[5094]: W0220 08:54:12.602558 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f62335a_f07f_45c0_9db2_5fbb91ed2588.slice/crio-d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb WatchSource:0}: Error finding container d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb: Status 404 returned error can't find the container with id d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.962508 5094 generic.go:334] "Generic (PLEG): container finished" podID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966" exitCode=0 Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.962885 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966"} Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.962962 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerStarted","Data":"d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb"} Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.967874 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerID="8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825" exitCode=0 Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.967934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerDied","Data":"8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825"} Feb 20 08:54:13 crc kubenswrapper[5094]: I0220 08:54:13.978490 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerStarted","Data":"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"} Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.447144 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535353 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535436 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535692 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535814 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.547679 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.547841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr" (OuterVolumeSpecName: "kube-api-access-6mdxr") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "kube-api-access-6mdxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.547923 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data" (OuterVolumeSpecName: "config-data") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.574669 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.637685 5094 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.637902 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.637967 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.638031 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.010598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerDied","Data":"ab10d89216ed579b6386deb45a07ae8b41f6f27357e9c40b4d0986f26b63851f"} Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.010661 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab10d89216ed579b6386deb45a07ae8b41f6f27357e9c40b4d0986f26b63851f" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.011998 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-87lwb" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.450214 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Feb 20 08:54:15 crc kubenswrapper[5094]: E0220 08:54:15.451208 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerName="manila-db-sync" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.451327 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerName="manila-db-sync" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.451650 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerName="manila-db-sync" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.453237 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.457587 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.457864 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zwzzl" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.458260 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.458548 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.468802 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.471002 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.475217 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.491215 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.513207 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.539644 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.541801 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.550219 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.566944 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29ad14f6-de76-4992-b46a-29f0822654c7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567078 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567190 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-ceph\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567208 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567249 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567264 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567735 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567796 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-scripts\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567908 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkj2\" (UniqueName: \"kubernetes.io/projected/29ad14f6-de76-4992-b46a-29f0822654c7-kube-api-access-skkj2\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567950 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjhb\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-kube-api-access-zmjhb\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567984 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.568013 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.568095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-scripts\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.669809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670101 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmjhb\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-kube-api-access-zmjhb\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670216 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670322 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670461 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-scripts\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670741 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29ad14f6-de76-4992-b46a-29f0822654c7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670841 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671011 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670760 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29ad14f6-de76-4992-b46a-29f0822654c7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671161 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-ceph\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671457 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671643 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671779 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-scripts\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671971 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkj2\" (UniqueName: \"kubernetes.io/projected/29ad14f6-de76-4992-b46a-29f0822654c7-kube-api-access-skkj2\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.680612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-ceph\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.681153 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-scripts\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.682686 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.682766 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.684056 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.690463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.690888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.690893 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.691206 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-scripts\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.691572 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.697893 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkj2\" (UniqueName: \"kubernetes.io/projected/29ad14f6-de76-4992-b46a-29f0822654c7-kube-api-access-skkj2\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.698915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmjhb\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-kube-api-access-zmjhb\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.748697 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.751036 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.766248 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.773948 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774166 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774315 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776123 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776236 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.783649 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.784383 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.794412 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.803680 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879703 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-scripts\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879854 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc6c3c7a-374b-49fc-98d5-852785c56ee7-etc-machine-id\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879919 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdksh\" (UniqueName: \"kubernetes.io/projected/fc6c3c7a-374b-49fc-98d5-852785c56ee7-kube-api-access-qdksh\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data-custom\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c3c7a-374b-49fc-98d5-852785c56ee7-logs\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.880375 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.982228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc6c3c7a-374b-49fc-98d5-852785c56ee7-etc-machine-id\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.981940 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc6c3c7a-374b-49fc-98d5-852785c56ee7-etc-machine-id\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.982999 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdksh\" (UniqueName: \"kubernetes.io/projected/fc6c3c7a-374b-49fc-98d5-852785c56ee7-kube-api-access-qdksh\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.983395 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data-custom\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c3c7a-374b-49fc-98d5-852785c56ee7-logs\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c3c7a-374b-49fc-98d5-852785c56ee7-logs\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984502 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984665 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984856 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-scripts\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.990847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-scripts\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.991719 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data-custom\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.994990 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.995621 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.006815 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdksh\" (UniqueName: \"kubernetes.io/projected/fc6c3c7a-374b-49fc-98d5-852785c56ee7-kube-api-access-qdksh\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0"
Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.265519 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.679227 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.715349 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"]
Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.797402 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.038644 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b","Type":"ContainerStarted","Data":"85208af740c940e58fba2853db780aefef08364e87cd43dea1d7ff6e20571fce"}
Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.041013 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"29ad14f6-de76-4992-b46a-29f0822654c7","Type":"ContainerStarted","Data":"8a6b028345388dd91c8ed8789e2f8c0b1b3c4e5ec81e75142289bb1a80a2f173"}
Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.043240 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerStarted","Data":"0a945c3e150afe6769cf7a5b87b40b125560d57829dfe7b7047e2bf325ea9a2b"}
Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.045963 5094 generic.go:334] "Generic (PLEG): container finished" podID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f" exitCode=0
Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.046007 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"}
Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.229298 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Feb 20 08:54:17 crc kubenswrapper[5094]: W0220 08:54:17.239887 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc6c3c7a_374b_49fc_98d5_852785c56ee7.slice/crio-72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827 WatchSource:0}: Error finding container 72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827: Status 404 returned error can't find the container with id 72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827
Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.078493 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"fc6c3c7a-374b-49fc-98d5-852785c56ee7","Type":"ContainerStarted","Data":"697b34c22592b7300fcfa008f6af9269cc58d1392d91cf6428dfd278683fd31d"}
Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.079064 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"fc6c3c7a-374b-49fc-98d5-852785c56ee7","Type":"ContainerStarted","Data":"72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827"}
Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.088911 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"29ad14f6-de76-4992-b46a-29f0822654c7","Type":"ContainerStarted","Data":"02c2e4acc5f6ea8d4625a64c6e75382a69f1aaa6a4719c07c6f25196ceeb481d"}
Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.099620 5094 generic.go:334] "Generic (PLEG): container finished" podID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" exitCode=0
Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.099694 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerDied","Data":"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1"}
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.109909 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerStarted","Data":"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9"}
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.110467 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.111558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"fc6c3c7a-374b-49fc-98d5-852785c56ee7","Type":"ContainerStarted","Data":"b6e4e6054b823428e4103a0b80aa4680f017b35ffcd5ff94e955860a002c75ba"}
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.111698 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.116775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerStarted","Data":"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"}
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.119181 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"29ad14f6-de76-4992-b46a-29f0822654c7","Type":"ContainerStarted","Data":"f21b439125592db20581223d0556db95ff4be1555bdefe40714a2539f53b8b67"}
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.131112 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" podStartSLOduration=4.1310913639999995 podStartE2EDuration="4.131091364s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:54:19.128568773 +0000 UTC m=+7674.001195484" watchObservedRunningTime="2026-02-20 08:54:19.131091364 +0000 UTC m=+7674.003718075"
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.150666 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.150647195 podStartE2EDuration="4.150647195s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:54:19.148250597 +0000 UTC m=+7674.020877308" watchObservedRunningTime="2026-02-20 08:54:19.150647195 +0000 UTC m=+7674.023273906"
Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.170318 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrlvp" podStartSLOduration=3.507867857 podStartE2EDuration="8.170303317s" podCreationTimestamp="2026-02-20 08:54:11 +0000 UTC" firstStartedPulling="2026-02-20 08:54:12.966455333 +0000 UTC m=+7667.839082034" lastFinishedPulling="2026-02-20 08:54:17.628890783 +0000 UTC m=+7672.501517494" observedRunningTime="2026-02-20 08:54:19.164295773 +0000 UTC m=+7674.036922484" watchObservedRunningTime="2026-02-20 08:54:19.170303317 +0000 UTC m=+7674.042930028"
Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.149526 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.150140 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.156866 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.188051 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=6.747042442 podStartE2EDuration="7.188032449s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="2026-02-20 08:54:16.723260101 +0000 UTC m=+7671.595886812" lastFinishedPulling="2026-02-20 08:54:17.164250108 +0000 UTC m=+7672.036876819" observedRunningTime="2026-02-20 08:54:19.188029134 +0000 UTC m=+7674.060655845" watchObservedRunningTime="2026-02-20 08:54:22.188032449 +0000 UTC m=+7677.060659160"
Feb 20 08:54:23 crc kubenswrapper[5094]: I0220 08:54:23.212170 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zrlvp" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server" probeResult="failure" output=<
Feb 20 08:54:23 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 08:54:23 crc kubenswrapper[5094]: >
Feb 20 08:54:25 crc kubenswrapper[5094]: I0220 08:54:25.796860 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Feb 20 08:54:25 crc kubenswrapper[5094]: I0220 08:54:25.894930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.002755 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"]
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.002980 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" containerID="cri-o://08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a" gracePeriod=10
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.235236 5094 generic.go:334] "Generic (PLEG): container finished" podID="4909c4ac-65fa-412c-990d-974868b0f104" containerID="08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a" exitCode=0
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.235322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerDied","Data":"08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a"}
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.590045 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn"
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.674972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") "
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") "
Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675167 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") "
Feb
20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675200 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675229 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.682353 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28" (OuterVolumeSpecName: "kube-api-access-sjv28") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "kube-api-access-sjv28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.728689 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config" (OuterVolumeSpecName: "config") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.729161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.732131 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.744199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777560 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777603 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777621 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777631 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") on node \"crc\" DevicePath \"\"" Feb 
20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777640 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.245382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerDied","Data":"8084f28745ebf13a7935e0af610ee153d1789476c3995420facfd289029eaab4"} Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.245396 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.245746 5094 scope.go:117] "RemoveContainer" containerID="08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.247334 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b","Type":"ContainerStarted","Data":"da55c28e144015c61d6361855066371e38dc4ac0636485764059023ff3e5347c"} Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.247360 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b","Type":"ContainerStarted","Data":"7136c5df75041d4d9854b7cef7d18bb2ad0c2f3f8413bda279a9327c80d8f1b6"} Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.273288 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.030862391 podStartE2EDuration="12.273269129s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="2026-02-20 08:54:16.811021352 +0000 UTC m=+7671.683648063" lastFinishedPulling="2026-02-20 08:54:26.05342809 +0000 UTC 
m=+7680.926054801" observedRunningTime="2026-02-20 08:54:27.265777789 +0000 UTC m=+7682.138404500" watchObservedRunningTime="2026-02-20 08:54:27.273269129 +0000 UTC m=+7682.145895840" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.274069 5094 scope.go:117] "RemoveContainer" containerID="f5767cd62a5a9e26fc88ffbe25eb74c9c4932ee6d1de8eb39356b77614dedec0" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.297799 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.305582 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.851091 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4909c4ac-65fa-412c-990d-974868b0f104" path="/var/lib/kubelet/pods/4909c4ac-65fa-412c-990d-974868b0f104/volumes" Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.408369 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.408937 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" containerID="cri-o://6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1" gracePeriod=30 Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.409008 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" containerID="cri-o://da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51" gracePeriod=30 Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.408998 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" containerID="cri-o://68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec" gracePeriod=30 Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.409028 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" containerID="cri-o://63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164" gracePeriod=30 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.291970 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec" exitCode=0 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292276 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164" exitCode=2 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292286 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51" exitCode=0 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292328 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292295 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1" exitCode=0 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.630869 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738025 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738112 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738222 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc 
kubenswrapper[5094]: I0220 08:54:29.738250 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738278 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738359 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738409 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.739060 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.739413 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.743970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts" (OuterVolumeSpecName: "scripts") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.744140 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf" (OuterVolumeSpecName: "kube-api-access-lg2cf") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "kube-api-access-lg2cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.771943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841005 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841233 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841569 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841738 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841759 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841986 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.860682 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data" (OuterVolumeSpecName: "config-data") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.944096 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.944125 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.303640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8"} Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.303712 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.303705 5094 scope.go:117] "RemoveContainer" containerID="68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.337772 5094 scope.go:117] "RemoveContainer" containerID="63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.340850 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.358056 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.372447 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.372979 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="init" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373004 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="init" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373025 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373032 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373067 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373076 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373084 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373090 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373105 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373113 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373151 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373159 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373369 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373396 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373405 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373418 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.375307 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.381156 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.384065 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.384172 5094 scope.go:117] "RemoveContainer" containerID="da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.384309 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.417648 5094 scope.go:117] "RemoveContainer" containerID="6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.452573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-config-data\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.452654 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " 
pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.452772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78lk\" (UniqueName: \"kubernetes.io/projected/2f751c26-9b9c-4a25-a388-cc52b0934ab6-kube-api-access-m78lk\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453047 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453079 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-scripts\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555312 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-config-data\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555459 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555501 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m78lk\" (UniqueName: \"kubernetes.io/projected/2f751c26-9b9c-4a25-a388-cc52b0934ab6-kube-api-access-m78lk\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-scripts\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555977 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.556198 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.559428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.561387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-scripts\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.562007 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.562605 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-config-data\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.580026 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78lk\" (UniqueName: \"kubernetes.io/projected/2f751c26-9b9c-4a25-a388-cc52b0934ab6-kube-api-access-m78lk\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.693750 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:54:31 crc kubenswrapper[5094]: W0220 08:54:31.220156 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f751c26_9b9c_4a25_a388_cc52b0934ab6.slice/crio-f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241 WatchSource:0}: Error finding container f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241: Status 404 returned error can't find the container with id f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241 Feb 20 08:54:31 crc kubenswrapper[5094]: I0220 08:54:31.222411 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:31 crc kubenswrapper[5094]: I0220 08:54:31.324628 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241"} Feb 20 08:54:31 crc kubenswrapper[5094]: I0220 08:54:31.853279 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" path="/var/lib/kubelet/pods/4faa0d21-cfca-4eae-a05a-3ec287395c30/volumes" Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.198069 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.255151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.339381 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"fa85623f0dc63071a2c04c1a12c55922711803a3d0d7da69877462774dd83d87"} Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.339426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"a334e0060f193223f9416b979637d2c6744c09e5e7365cbe9cad13dc86ce4a73"} Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.438467 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"] Feb 20 08:54:33 crc kubenswrapper[5094]: I0220 08:54:33.350314 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"d8ed66e59ddc31f00fbe729caf0c99ab348bd906da78040a6607314a54f0742c"} Feb 20 08:54:33 crc kubenswrapper[5094]: I0220 08:54:33.350450 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-zrlvp" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server" containerID="cri-o://03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35" gracePeriod=2 Feb 20 08:54:33 crc kubenswrapper[5094]: I0220 08:54:33.887426 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.030522 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.030614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.030654 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.031571 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities" (OuterVolumeSpecName: "utilities") pod "7f62335a-f07f-45c0-9db2-5fbb91ed2588" (UID: "7f62335a-f07f-45c0-9db2-5fbb91ed2588"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.052670 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n" (OuterVolumeSpecName: "kube-api-access-jll2n") pod "7f62335a-f07f-45c0-9db2-5fbb91ed2588" (UID: "7f62335a-f07f-45c0-9db2-5fbb91ed2588"). InnerVolumeSpecName "kube-api-access-jll2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.100381 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f62335a-f07f-45c0-9db2-5fbb91ed2588" (UID: "7f62335a-f07f-45c0-9db2-5fbb91ed2588"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.107575 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.107631 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.133383 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:34 
crc kubenswrapper[5094]: I0220 08:54:34.133447 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.133461 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362290 5094 generic.go:334] "Generic (PLEG): container finished" podID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35" exitCode=0 Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362342 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"} Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362392 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb"} Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362409 5094 scope.go:117] "RemoveContainer" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362572 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.406184 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"] Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.411262 5094 scope.go:117] "RemoveContainer" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.419263 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"] Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.537764 5094 scope.go:117] "RemoveContainer" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.560848 5094 scope.go:117] "RemoveContainer" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35" Feb 20 08:54:34 crc kubenswrapper[5094]: E0220 08:54:34.561256 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35\": container with ID starting with 03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35 not found: ID does not exist" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561285 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"} err="failed to get container status \"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35\": rpc error: code = NotFound desc = could not find container \"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35\": container with ID starting with 03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35 not 
found: ID does not exist" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561304 5094 scope.go:117] "RemoveContainer" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f" Feb 20 08:54:34 crc kubenswrapper[5094]: E0220 08:54:34.561605 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f\": container with ID starting with e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f not found: ID does not exist" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561625 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"} err="failed to get container status \"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f\": rpc error: code = NotFound desc = could not find container \"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f\": container with ID starting with e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f not found: ID does not exist" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561637 5094 scope.go:117] "RemoveContainer" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966" Feb 20 08:54:34 crc kubenswrapper[5094]: E0220 08:54:34.561897 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966\": container with ID starting with 5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966 not found: ID does not exist" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966" Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561947 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966"} err="failed to get container status \"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966\": rpc error: code = NotFound desc = could not find container \"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966\": container with ID starting with 5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966 not found: ID does not exist" Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.376182 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"0bdddbfa20ccd1fd60f8dd4da4d1b072906e9ab932c9fbc6487a0462f0ae21e4"} Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.377595 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.414020 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.215743126 podStartE2EDuration="5.4139993s" podCreationTimestamp="2026-02-20 08:54:30 +0000 UTC" firstStartedPulling="2026-02-20 08:54:31.224858033 +0000 UTC m=+7686.097484744" lastFinishedPulling="2026-02-20 08:54:34.423114207 +0000 UTC m=+7689.295740918" observedRunningTime="2026-02-20 08:54:35.398120207 +0000 UTC m=+7690.270746918" watchObservedRunningTime="2026-02-20 08:54:35.4139993 +0000 UTC m=+7690.286626011" Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.785683 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.851411 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" 
path="/var/lib/kubelet/pods/7f62335a-f07f-45c0-9db2-5fbb91ed2588/volumes" Feb 20 08:54:37 crc kubenswrapper[5094]: I0220 08:54:37.373616 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 20 08:54:37 crc kubenswrapper[5094]: I0220 08:54:37.693459 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Feb 20 08:54:41 crc kubenswrapper[5094]: I0220 08:54:41.620276 5094 scope.go:117] "RemoveContainer" containerID="3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333" Feb 20 08:54:41 crc kubenswrapper[5094]: I0220 08:54:41.660770 5094 scope.go:117] "RemoveContainer" containerID="9b130573754c208a821c2a5aa00744abfcde1ec2f224d985ae00e81ebcaa218e" Feb 20 08:54:41 crc kubenswrapper[5094]: I0220 08:54:41.701203 5094 scope.go:117] "RemoveContainer" containerID="8dfc18891e7f2cecc2e704cc07266d7a47a98f1dcf9f167194c7d37d347b850e" Feb 20 08:54:47 crc kubenswrapper[5094]: I0220 08:54:47.356917 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.055074 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.073436 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.082913 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.091829 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.101009 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:55:00 crc 
kubenswrapper[5094]: I0220 08:55:00.109825 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.118623 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.127277 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.700260 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.032521 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.041023 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.049823 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.058419 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.850277 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" path="/var/lib/kubelet/pods/2ab2f8a8-e11c-4b13-a12f-7006756e4d56/volumes" Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.851128 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" path="/var/lib/kubelet/pods/33e893dc-597d-4b0d-b59d-04c636d58ce4/volumes" Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.851711 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" path="/var/lib/kubelet/pods/62afc590-4a32-45a1-b7e9-bde09c7f0b6a/volumes" Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.852352 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" path="/var/lib/kubelet/pods/90396e9c-2602-41dd-92c3-da38bb5f7be7/volumes" Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.853409 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" path="/var/lib/kubelet/pods/95274e98-2b48-4b4d-b0c5-5dedafedc43f/volumes" Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.854060 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" path="/var/lib/kubelet/pods/e24ca1b9-7440-432c-a0eb-58a17f83a8ee/volumes" Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.107125 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.107540 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.107594 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.108692 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.108787 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" gracePeriod=600 Feb 20 08:55:04 crc kubenswrapper[5094]: E0220 08:55:04.238235 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.679450 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" exitCode=0 Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.679500 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"} Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.679541 5094 scope.go:117] "RemoveContainer" containerID="e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268" Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.681380 5094 
scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:55:04 crc kubenswrapper[5094]: E0220 08:55:04.681847 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:55:18 crc kubenswrapper[5094]: I0220 08:55:18.037080 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:55:18 crc kubenswrapper[5094]: I0220 08:55:18.051313 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:55:19 crc kubenswrapper[5094]: I0220 08:55:19.840693 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:55:19 crc kubenswrapper[5094]: E0220 08:55:19.841260 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:55:19 crc kubenswrapper[5094]: I0220 08:55:19.851407 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" path="/var/lib/kubelet/pods/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91/volumes" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.111415 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"] Feb 20 08:55:29 crc kubenswrapper[5094]: E0220 08:55:29.112348 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-utilities" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112361 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-utilities" Feb 20 08:55:29 crc kubenswrapper[5094]: E0220 08:55:29.112387 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-content" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112393 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-content" Feb 20 08:55:29 crc kubenswrapper[5094]: E0220 08:55:29.112407 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112413 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112621 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.113920 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.118417 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.127881 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"] Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218757 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218875 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " 
pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.219089 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.219119 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321626 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321736 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " 
pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321957 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.322755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.322765 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.323324 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.323481 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.323973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.340495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.431312 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:30 crc kubenswrapper[5094]: I0220 08:55:29.909851 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"] Feb 20 08:55:30 crc kubenswrapper[5094]: W0220 08:55:29.911122 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef4d7b9_7970_4336_859e_08e2a4820524.slice/crio-366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6 WatchSource:0}: Error finding container 366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6: Status 404 returned error can't find the container with id 366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6 Feb 20 08:55:30 crc kubenswrapper[5094]: I0220 08:55:29.974857 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerStarted","Data":"366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6"} Feb 20 08:55:30 crc kubenswrapper[5094]: I0220 08:55:30.840468 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:55:30 crc kubenswrapper[5094]: E0220 08:55:30.841289 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:55:31 crc kubenswrapper[5094]: I0220 08:55:31.028827 5094 generic.go:334] "Generic (PLEG): container finished" podID="7ef4d7b9-7970-4336-859e-08e2a4820524" 
containerID="d0d02e83461148d891a35888220f3b22b91ec3fc7a98679a44ab912f988c01ed" exitCode=0 Feb 20 08:55:31 crc kubenswrapper[5094]: I0220 08:55:31.028878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerDied","Data":"d0d02e83461148d891a35888220f3b22b91ec3fc7a98679a44ab912f988c01ed"} Feb 20 08:55:32 crc kubenswrapper[5094]: I0220 08:55:32.038858 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerStarted","Data":"9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d"} Feb 20 08:55:32 crc kubenswrapper[5094]: I0220 08:55:32.039205 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:32 crc kubenswrapper[5094]: I0220 08:55:32.064159 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" podStartSLOduration=3.064138271 podStartE2EDuration="3.064138271s" podCreationTimestamp="2026-02-20 08:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:55:32.058810733 +0000 UTC m=+7746.931437444" watchObservedRunningTime="2026-02-20 08:55:32.064138271 +0000 UTC m=+7746.936764982" Feb 20 08:55:36 crc kubenswrapper[5094]: I0220 08:55:36.059897 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"] Feb 20 08:55:36 crc kubenswrapper[5094]: I0220 08:55:36.074187 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"] Feb 20 08:55:37 crc kubenswrapper[5094]: I0220 08:55:37.038243 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:55:37 crc 
kubenswrapper[5094]: I0220 08:55:37.051884 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:55:37 crc kubenswrapper[5094]: I0220 08:55:37.861328 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077dc649-6898-4f04-837d-b694decf612b" path="/var/lib/kubelet/pods/077dc649-6898-4f04-837d-b694decf612b/volumes" Feb 20 08:55:37 crc kubenswrapper[5094]: I0220 08:55:37.862755 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" path="/var/lib/kubelet/pods/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3/volumes" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.433393 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.491237 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.491462 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns" containerID="cri-o://355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" gracePeriod=10 Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.664606 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.681228 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.682015 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767113 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767481 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767543 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767567 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767623 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.769160 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872405 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872480 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872567 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872590 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.873471 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.873491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.873669 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.874169 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.874193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.916390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.043778 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.124262 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160537 5094 generic.go:334] "Generic (PLEG): container finished" podID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" exitCode=0 Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160935 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerDied","Data":"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9"} Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160981 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerDied","Data":"0a945c3e150afe6769cf7a5b87b40b125560d57829dfe7b7047e2bf325ea9a2b"} Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.161003 5094 scope.go:117] "RemoveContainer" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160956 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178200 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178244 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178287 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.183825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6" (OuterVolumeSpecName: "kube-api-access-lxvk6") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "kube-api-access-lxvk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.191070 5094 scope.go:117] "RemoveContainer" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.217296 5094 scope.go:117] "RemoveContainer" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" Feb 20 08:55:40 crc kubenswrapper[5094]: E0220 08:55:40.218741 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9\": container with ID starting with 355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9 not found: ID does not exist" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.218787 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9"} err="failed to get container status \"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9\": rpc error: code = NotFound desc = could not find container \"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9\": container with ID starting with 355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9 not found: ID does not exist" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.218816 5094 scope.go:117] "RemoveContainer" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" Feb 20 08:55:40 crc kubenswrapper[5094]: E0220 08:55:40.219180 5094 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1\": container with ID starting with 4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1 not found: ID does not exist" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.219209 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1"} err="failed to get container status \"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1\": rpc error: code = NotFound desc = could not find container \"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1\": container with ID starting with 4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1 not found: ID does not exist" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.243143 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.248347 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.276606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.279878 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config" (OuterVolumeSpecName: "config") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280124 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280151 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280163 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280172 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: 
I0220 08:55:40.280183 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.498832 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.508507 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:55:40 crc kubenswrapper[5094]: W0220 08:55:40.527092 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31655ca7_8da2_4f24_a0c6_49ba0de9d207.slice/crio-b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a WatchSource:0}: Error finding container b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a: Status 404 returned error can't find the container with id b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.529635 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.172013 5094 generic.go:334] "Generic (PLEG): container finished" podID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f" exitCode=0 Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.172088 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerDied","Data":"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f"} Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.172155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerStarted","Data":"b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a"} Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.849951 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" path="/var/lib/kubelet/pods/d0e42f57-46ec-4ef5-a4f0-f262ce003602/volumes" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.857939 5094 scope.go:117] "RemoveContainer" containerID="a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.901983 5094 scope.go:117] "RemoveContainer" containerID="20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.953871 5094 scope.go:117] "RemoveContainer" containerID="18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.981941 5094 scope.go:117] "RemoveContainer" containerID="cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.039794 5094 scope.go:117] "RemoveContainer" containerID="7d035de1d36dadcbc2b1699a2d04fbaf8dc66a5156f2934f3122a703497829c7" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.064918 5094 scope.go:117] "RemoveContainer" containerID="b22a4c98fab8bd430cea1082edfc23c911f8d32bd3adc55526aec0a42c5684bd" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.107187 5094 scope.go:117] "RemoveContainer" containerID="3f997facacb6313e0f115f2a2227ee22f54c84973cc33a1b4f4cc4cd0e2df3df" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.128003 5094 scope.go:117] "RemoveContainer" containerID="814c099e47197bd5868d74e553deb48d652a97c496a27496d27f367ca0750674" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.152245 5094 scope.go:117] "RemoveContainer" 
containerID="0d114a7c88828f83388e2e035f175ac9a3e4b92dd7429d32fee56582784e51b6" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.179345 5094 scope.go:117] "RemoveContainer" containerID="4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.190184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerStarted","Data":"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"} Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.190292 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.212269 5094 scope.go:117] "RemoveContainer" containerID="a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.215850 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" podStartSLOduration=3.215832858 podStartE2EDuration="3.215832858s" podCreationTimestamp="2026-02-20 08:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:55:42.20717235 +0000 UTC m=+7757.079799061" watchObservedRunningTime="2026-02-20 08:55:42.215832858 +0000 UTC m=+7757.088459569" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.840435 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:55:42 crc kubenswrapper[5094]: E0220 08:55:42.840910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.044836 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.123751 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"]
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.124046 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns" containerID="cri-o://9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d" gracePeriod=10
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.311632 5094 generic.go:334] "Generic (PLEG): container finished" podID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerID="9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d" exitCode=0
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.312355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerDied","Data":"9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d"}
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.362559 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8479b9d65f-4mzqh"]
Feb 20 08:55:50 crc kubenswrapper[5094]: E0220 08:55:50.363010 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.363026 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns"
Feb 20 08:55:50 crc kubenswrapper[5094]: E0220 08:55:50.363058 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="init"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.363065 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="init"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.363254 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.364435 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.368390 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-networker"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.373642 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8479b9d65f-4mzqh"]
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.429853 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-dns-svc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.429902 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.429986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-config\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-cell1\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-networker\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl9xc\" (UniqueName: \"kubernetes.io/projected/bee88947-a5ae-4438-9283-a3fc34fde9e4-kube-api-access-zl9xc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.534187 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-networker\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.534890 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl9xc\" (UniqueName: \"kubernetes.io/projected/bee88947-a5ae-4438-9283-a3fc34fde9e4-kube-api-access-zl9xc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-dns-svc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535345 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-config\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535373 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-cell1\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.537751 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-config\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.537763 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-dns-svc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.537842 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.538177 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.538617 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-networker\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.546856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-cell1\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.567006 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl9xc\" (UniqueName: \"kubernetes.io/projected/bee88947-a5ae-4438-9283-a3fc34fde9e4-kube-api-access-zl9xc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.664997 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.700499 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.738937 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") "
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.739873 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") "
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740051 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") "
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740132 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") "
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740161 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") "
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740182 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") "
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.750987 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8" (OuterVolumeSpecName: "kube-api-access-spbf8") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "kube-api-access-spbf8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.806902 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.809157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.810407 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config" (OuterVolumeSpecName: "config") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.826637 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.835390 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845243 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845946 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845970 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845988 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") on node \"crc\" DevicePath \"\""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.846000 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.846008 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.076988 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"]
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.088554 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"]
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.232388 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8479b9d65f-4mzqh"]
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.331405 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerDied","Data":"366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6"}
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.331460 5094 scope.go:117] "RemoveContainer" containerID="9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d"
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.331625 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.334733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" event={"ID":"bee88947-a5ae-4438-9283-a3fc34fde9e4","Type":"ContainerStarted","Data":"c6e5f29abf3edeb7e7f3059a5c05f5d2fb0bc1754c58a00813e1efd58de28903"}
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.427862 5094 scope.go:117] "RemoveContainer" containerID="d0d02e83461148d891a35888220f3b22b91ec3fc7a98679a44ab912f988c01ed"
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.463214 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"]
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.473291 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"]
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.851827 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" path="/var/lib/kubelet/pods/47f4c643-fb8b-41d6-97b5-fa0c0928f370/volumes"
Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.852483 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" path="/var/lib/kubelet/pods/7ef4d7b9-7970-4336-859e-08e2a4820524/volumes"
Feb 20 08:55:52 crc kubenswrapper[5094]: I0220 08:55:52.346399 5094 generic.go:334] "Generic (PLEG): container finished" podID="bee88947-a5ae-4438-9283-a3fc34fde9e4" containerID="daf047feda2b65bc85c2d0be5690fb87de0d553533044e35731527c9a067908e" exitCode=0
Feb 20 08:55:52 crc kubenswrapper[5094]: I0220 08:55:52.347063 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" event={"ID":"bee88947-a5ae-4438-9283-a3fc34fde9e4","Type":"ContainerDied","Data":"daf047feda2b65bc85c2d0be5690fb87de0d553533044e35731527c9a067908e"}
Feb 20 08:55:53 crc kubenswrapper[5094]: I0220 08:55:53.360520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" event={"ID":"bee88947-a5ae-4438-9283-a3fc34fde9e4","Type":"ContainerStarted","Data":"f0467c881595e5ec69b0db128e431cd7ecd6a345800ad96608f675ab91569186"}
Feb 20 08:55:53 crc kubenswrapper[5094]: I0220 08:55:53.361139 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:55:53 crc kubenswrapper[5094]: I0220 08:55:53.388571 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" podStartSLOduration=3.388553284 podStartE2EDuration="3.388553284s" podCreationTimestamp="2026-02-20 08:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:55:53.382821906 +0000 UTC m=+7768.255448617" watchObservedRunningTime="2026-02-20 08:55:53.388553284 +0000 UTC m=+7768.261179995"
Feb 20 08:55:56 crc kubenswrapper[5094]: I0220 08:55:56.840628 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"
Feb 20 08:55:56 crc kubenswrapper[5094]: E0220 08:55:56.841842 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:56:00 crc kubenswrapper[5094]: I0220 08:56:00.702764 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh"
Feb 20 08:56:00 crc kubenswrapper[5094]: I0220 08:56:00.768978 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"]
Feb 20 08:56:00 crc kubenswrapper[5094]: I0220 08:56:00.769254 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns" containerID="cri-o://b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1" gracePeriod=10
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.308505 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377152 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"]
Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.377580 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="init"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377599 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="init"
Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.377618 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377625 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns"
Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.377639 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377646 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns"
Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.377683 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="init"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377688 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="init"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377874 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377895 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.379122 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.388401 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"]
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.389589 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.389903 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.390177 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.390228 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.390505 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.392936 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") "
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.392989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") "
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393029 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") "
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393064 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") "
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393118 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") "
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393224 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") "
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.404022 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.404289 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.412932 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6" (OuterVolumeSpecName: "kube-api-access-sz5g6") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "kube-api-access-sz5g6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.433889 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"]
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.458895 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"]
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481608 5094 generic.go:334] "Generic (PLEG): container finished" podID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1" exitCode=0
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481653 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerDied","Data":"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"}
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481677 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerDied","Data":"b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a"}
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481694 5094 scope.go:117] "RemoveContainer" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481881 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.488383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.501825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.519245 5094 scope.go:117] "RemoveContainer" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.521240 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.525471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.525673 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.525739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526046 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526215 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526360 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526429 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526819 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526837 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") on node \"crc\" DevicePath \"\""
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526856 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.542732 5094 scope.go:117] "RemoveContainer" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"
Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.544466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1\": container with ID starting with b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1 not found: ID does not exist" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.544503 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"} err="failed to get container status \"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1\": rpc error: code = NotFound desc = could not find container \"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1\": container with ID starting with b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1 not found: ID does not exist"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.544526 5094 scope.go:117] "RemoveContainer" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f"
Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.544796 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f\": container with ID starting with 50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f not found: ID does not exist" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.544815 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f"} err="failed to get container status \"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f\": rpc error: code = NotFound desc = could not find container \"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f\": container with ID starting with 50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f not found: ID does not exist"
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.553664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config" (OuterVolumeSpecName: "config") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.554359 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "openstack-cell1".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.563059 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628490 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628536 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628641 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628745 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628857 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628884 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628940 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628951 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628963 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.632050 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 
08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.632600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634054 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634372 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.653174 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.655954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.705221 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.803820 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.832127 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.865740 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 08:56:02.290887 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"] Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 08:56:02.443918 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"] Feb 20 08:56:02 crc kubenswrapper[5094]: W0220 08:56:02.454756 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee17d13_1b0d_49a2_a515_cc63a2f62c63.slice/crio-a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473 WatchSource:0}: Error finding container a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473: Status 404 returned error can't find the container with id a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473 Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 08:56:02.493914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerStarted","Data":"a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473"} Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 
08:56:02.495225 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerStarted","Data":"8214a023e8d2d42a8772770e282ea397fd96754091592c1ccdaf0cbad0e9d784"} Feb 20 08:56:03 crc kubenswrapper[5094]: I0220 08:56:03.866994 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" path="/var/lib/kubelet/pods/31655ca7-8da2-4f24-a0c6-49ba0de9d207/volumes" Feb 20 08:56:10 crc kubenswrapper[5094]: I0220 08:56:10.840966 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:10 crc kubenswrapper[5094]: E0220 08:56:10.841729 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.601409 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerStarted","Data":"3a06af21ed40f27cd0fdfa49cc6ae8b158034daa7a7d6e9dbd4c9377f0b0dbee"} Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.603982 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerStarted","Data":"b0838c70e985c9d6c06753020cfb1fedf883083444d6817034896958cb9b23f1"} Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.629789 5094 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" podStartSLOduration=2.285655248 podStartE2EDuration="11.62976811s" podCreationTimestamp="2026-02-20 08:56:01 +0000 UTC" firstStartedPulling="2026-02-20 08:56:02.457134511 +0000 UTC m=+7777.329761232" lastFinishedPulling="2026-02-20 08:56:11.801247363 +0000 UTC m=+7786.673874094" observedRunningTime="2026-02-20 08:56:12.619922973 +0000 UTC m=+7787.492549704" watchObservedRunningTime="2026-02-20 08:56:12.62976811 +0000 UTC m=+7787.502394831" Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.647401 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" podStartSLOduration=2.158817597 podStartE2EDuration="11.647379804s" podCreationTimestamp="2026-02-20 08:56:01 +0000 UTC" firstStartedPulling="2026-02-20 08:56:02.298576397 +0000 UTC m=+7777.171203108" lastFinishedPulling="2026-02-20 08:56:11.787138604 +0000 UTC m=+7786.659765315" observedRunningTime="2026-02-20 08:56:12.643531111 +0000 UTC m=+7787.516157822" watchObservedRunningTime="2026-02-20 08:56:12.647379804 +0000 UTC m=+7787.520006525" Feb 20 08:56:21 crc kubenswrapper[5094]: I0220 08:56:21.691496 5094 generic.go:334] "Generic (PLEG): container finished" podID="750a5132-7613-40c0-a360-2f1a589d2554" containerID="b0838c70e985c9d6c06753020cfb1fedf883083444d6817034896958cb9b23f1" exitCode=0 Feb 20 08:56:21 crc kubenswrapper[5094]: I0220 08:56:21.691610 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerDied","Data":"b0838c70e985c9d6c06753020cfb1fedf883083444d6817034896958cb9b23f1"} Feb 20 08:56:22 crc kubenswrapper[5094]: I0220 08:56:22.710371 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerID="3a06af21ed40f27cd0fdfa49cc6ae8b158034daa7a7d6e9dbd4c9377f0b0dbee" exitCode=0 Feb 20 08:56:22 crc kubenswrapper[5094]: I0220 08:56:22.710492 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerDied","Data":"3a06af21ed40f27cd0fdfa49cc6ae8b158034daa7a7d6e9dbd4c9377f0b0dbee"} Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.193951 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.254743 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.255125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.255163 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.255204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.262335 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.262522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj" (OuterVolumeSpecName: "kube-api-access-wqrcj") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). InnerVolumeSpecName "kube-api-access-wqrcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.285042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory" (OuterVolumeSpecName: "inventory") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.287224 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). 
InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357278 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357331 5094 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357348 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357362 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.733804 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.738124 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerDied","Data":"8214a023e8d2d42a8772770e282ea397fd96754091592c1ccdaf0cbad0e9d784"} Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.738266 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8214a023e8d2d42a8772770e282ea397fd96754091592c1ccdaf0cbad0e9d784" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.191620 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275171 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275270 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: 
\"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275502 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275621 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.279040 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97" (OuterVolumeSpecName: "kube-api-access-6qt97") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "kube-api-access-6qt97". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.279204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.280999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph" (OuterVolumeSpecName: "ceph") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.302085 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.303472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory" (OuterVolumeSpecName: "inventory") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378301 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378338 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378349 5094 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378362 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378375 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.742629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerDied","Data":"a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473"} Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.742672 5094 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.742781 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:24 crc kubenswrapper[5094]: E0220 08:56:24.845290 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee17d13_1b0d_49a2_a515_cc63a2f62c63.slice\": RecentStats: unable to find data in memory cache]" Feb 20 08:56:25 crc kubenswrapper[5094]: I0220 08:56:25.847357 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:25 crc kubenswrapper[5094]: E0220 08:56:25.847909 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:33 crc kubenswrapper[5094]: I0220 08:56:33.060272 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4ggkd"] Feb 20 08:56:33 crc kubenswrapper[5094]: I0220 08:56:33.072295 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4ggkd"] Feb 20 08:56:33 crc kubenswrapper[5094]: I0220 08:56:33.857308 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" path="/var/lib/kubelet/pods/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991/volumes" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.031853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-dc9b-account-create-update-6hd8k"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.040243 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.778655 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6"] Feb 20 08:56:34 crc kubenswrapper[5094]: E0220 08:56:34.779579 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.779614 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: E0220 08:56:34.779670 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750a5132-7613-40c0-a360-2f1a589d2554" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.779684 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="750a5132-7613-40c0-a360-2f1a589d2554" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.780150 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.780193 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="750a5132-7613-40c0-a360-2f1a589d2554" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.781394 5094 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784011 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784152 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784572 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784821 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.794541 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.796023 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.797941 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.798396 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804344 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804429 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804456 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804729 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.818883 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.832128 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906519 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") 
" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906759 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906876 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906989 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.907018 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.907186 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.912271 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.912517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: 
\"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.912600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.913142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.920202 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.920887 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.925232 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.927404 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.930888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.122168 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.134683 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.722163 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6"] Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.826241 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"] Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.859638 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" path="/var/lib/kubelet/pods/7475a056-ad82-42aa-85ee-4b5d6834434a/volumes" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.870401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerStarted","Data":"0c39006c151d238a3f167340ddd6cab9c442fbcdd2b65206e97f1302317761f1"} Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.872285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerStarted","Data":"ca240917156fd9c5acb4173083f283943d7a62c3a3cbea59cf29d0d548923640"} Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.934189 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerStarted","Data":"24186d99d2e091e90cea103f3ededd5ae0be73e5479d2f80e87c425b36de8252"} Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.937312 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" 
event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerStarted","Data":"a63a79298a636726404dd99afe9f61c35ff225d0787a9c54da454b3cf54a459f"} Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.960291 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" podStartSLOduration=2.66317616 podStartE2EDuration="4.96027409s" podCreationTimestamp="2026-02-20 08:56:34 +0000 UTC" firstStartedPulling="2026-02-20 08:56:35.736200064 +0000 UTC m=+7810.608826795" lastFinishedPulling="2026-02-20 08:56:38.033298004 +0000 UTC m=+7812.905924725" observedRunningTime="2026-02-20 08:56:38.948516657 +0000 UTC m=+7813.821143368" watchObservedRunningTime="2026-02-20 08:56:38.96027409 +0000 UTC m=+7813.832900801" Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.970647 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" podStartSLOduration=2.598907143 podStartE2EDuration="4.970622928s" podCreationTimestamp="2026-02-20 08:56:34 +0000 UTC" firstStartedPulling="2026-02-20 08:56:35.829920928 +0000 UTC m=+7810.702547639" lastFinishedPulling="2026-02-20 08:56:38.201636713 +0000 UTC m=+7813.074263424" observedRunningTime="2026-02-20 08:56:38.968824085 +0000 UTC m=+7813.841450796" watchObservedRunningTime="2026-02-20 08:56:38.970622928 +0000 UTC m=+7813.843249679" Feb 20 08:56:39 crc kubenswrapper[5094]: I0220 08:56:39.841815 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:39 crc kubenswrapper[5094]: E0220 08:56:39.842867 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:42 crc kubenswrapper[5094]: I0220 08:56:42.575585 5094 scope.go:117] "RemoveContainer" containerID="304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c" Feb 20 08:56:42 crc kubenswrapper[5094]: I0220 08:56:42.615358 5094 scope.go:117] "RemoveContainer" containerID="8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16" Feb 20 08:56:42 crc kubenswrapper[5094]: I0220 08:56:42.700020 5094 scope.go:117] "RemoveContainer" containerID="e06fc5fd3620d2019a01f12c26721ba58935bf528cffce9cee66802b4ab5054a" Feb 20 08:56:53 crc kubenswrapper[5094]: I0220 08:56:53.842342 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:53 crc kubenswrapper[5094]: E0220 08:56:53.843376 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:59 crc kubenswrapper[5094]: I0220 08:56:59.045883 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-n78qt"] Feb 20 08:56:59 crc kubenswrapper[5094]: I0220 08:56:59.054337 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-n78qt"] Feb 20 08:56:59 crc kubenswrapper[5094]: I0220 08:56:59.852994 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" path="/var/lib/kubelet/pods/0cbb5a80-aef6-405d-bf92-d0d9cc872c78/volumes" Feb 20 
08:57:05 crc kubenswrapper[5094]: I0220 08:57:05.847871 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:05 crc kubenswrapper[5094]: E0220 08:57:05.849114 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:57:20 crc kubenswrapper[5094]: I0220 08:57:20.840857 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:20 crc kubenswrapper[5094]: E0220 08:57:20.841498 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:57:34 crc kubenswrapper[5094]: I0220 08:57:34.841777 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:34 crc kubenswrapper[5094]: E0220 08:57:34.843959 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:57:42 crc kubenswrapper[5094]: I0220 08:57:42.826525 5094 scope.go:117] "RemoveContainer" containerID="08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd" Feb 20 08:57:48 crc kubenswrapper[5094]: I0220 08:57:48.841070 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:48 crc kubenswrapper[5094]: E0220 08:57:48.842190 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:03 crc kubenswrapper[5094]: I0220 08:58:03.841344 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:03 crc kubenswrapper[5094]: E0220 08:58:03.842609 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:17 crc kubenswrapper[5094]: I0220 08:58:17.841030 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:17 crc kubenswrapper[5094]: E0220 08:58:17.842459 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:28 crc kubenswrapper[5094]: I0220 08:58:28.841319 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:28 crc kubenswrapper[5094]: E0220 08:58:28.842115 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.586621 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.589571 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.619100 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.712653 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.712854 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.713050 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.815380 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.815489 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.815534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.816086 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.816129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.832292 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.930408 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:30 crc kubenswrapper[5094]: I0220 08:58:30.442165 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:31 crc kubenswrapper[5094]: I0220 08:58:31.170577 5094 generic.go:334] "Generic (PLEG): container finished" podID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" exitCode=0 Feb 20 08:58:31 crc kubenswrapper[5094]: I0220 08:58:31.170675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a"} Feb 20 08:58:31 crc kubenswrapper[5094]: I0220 08:58:31.171007 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerStarted","Data":"35f7059f97b45604919176cfdf23aa8bb5a05b3d1142abcc4ccdc4a4de0f3a19"} Feb 20 08:58:32 crc kubenswrapper[5094]: I0220 08:58:32.182801 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerStarted","Data":"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0"} Feb 20 08:58:33 crc kubenswrapper[5094]: I0220 08:58:33.194577 5094 generic.go:334] "Generic (PLEG): container finished" podID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" exitCode=0 Feb 20 08:58:33 crc kubenswrapper[5094]: I0220 08:58:33.194730 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" 
event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0"} Feb 20 08:58:33 crc kubenswrapper[5094]: I0220 08:58:33.197776 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:58:34 crc kubenswrapper[5094]: I0220 08:58:34.205398 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerStarted","Data":"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451"} Feb 20 08:58:34 crc kubenswrapper[5094]: I0220 08:58:34.231614 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fhdkc" podStartSLOduration=2.7863152429999998 podStartE2EDuration="5.231597816s" podCreationTimestamp="2026-02-20 08:58:29 +0000 UTC" firstStartedPulling="2026-02-20 08:58:31.172418227 +0000 UTC m=+7926.045044938" lastFinishedPulling="2026-02-20 08:58:33.61770079 +0000 UTC m=+7928.490327511" observedRunningTime="2026-02-20 08:58:34.22760326 +0000 UTC m=+7929.100230001" watchObservedRunningTime="2026-02-20 08:58:34.231597816 +0000 UTC m=+7929.104224537" Feb 20 08:58:39 crc kubenswrapper[5094]: I0220 08:58:39.931087 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:39 crc kubenswrapper[5094]: I0220 08:58:39.933408 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:39 crc kubenswrapper[5094]: I0220 08:58:39.987260 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:40 crc kubenswrapper[5094]: I0220 08:58:40.325754 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:40 crc kubenswrapper[5094]: I0220 08:58:40.387006 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:40 crc kubenswrapper[5094]: I0220 08:58:40.841259 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:40 crc kubenswrapper[5094]: E0220 08:58:40.842003 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.287969 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fhdkc" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" containerID="cri-o://a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" gracePeriod=2 Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.821223 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.944699 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"49822999-3ac2-4409-ac86-dec02d5e0e7b\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.944847 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"49822999-3ac2-4409-ac86-dec02d5e0e7b\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.945023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"49822999-3ac2-4409-ac86-dec02d5e0e7b\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.945724 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities" (OuterVolumeSpecName: "utilities") pod "49822999-3ac2-4409-ac86-dec02d5e0e7b" (UID: "49822999-3ac2-4409-ac86-dec02d5e0e7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.958967 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf" (OuterVolumeSpecName: "kube-api-access-jgqnf") pod "49822999-3ac2-4409-ac86-dec02d5e0e7b" (UID: "49822999-3ac2-4409-ac86-dec02d5e0e7b"). InnerVolumeSpecName "kube-api-access-jgqnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.967785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49822999-3ac2-4409-ac86-dec02d5e0e7b" (UID: "49822999-3ac2-4409-ac86-dec02d5e0e7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.047304 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") on node \"crc\" DevicePath \"\"" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.047342 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.047357 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304638 5094 generic.go:334] "Generic (PLEG): container finished" podID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" exitCode=0 Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304685 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451"} Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304736 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"35f7059f97b45604919176cfdf23aa8bb5a05b3d1142abcc4ccdc4a4de0f3a19"} Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304757 5094 scope.go:117] "RemoveContainer" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304886 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.360676 5094 scope.go:117] "RemoveContainer" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.373490 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.388909 5094 scope.go:117] "RemoveContainer" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.426513 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.440405 5094 scope.go:117] "RemoveContainer" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" Feb 20 08:58:43 crc kubenswrapper[5094]: E0220 08:58:43.440931 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451\": container with ID starting with a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451 not found: ID does not exist" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.440964 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451"} err="failed to get container status \"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451\": rpc error: code = NotFound desc = could not find container \"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451\": container with ID starting with a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451 not found: ID does not exist" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.440984 5094 scope.go:117] "RemoveContainer" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" Feb 20 08:58:43 crc kubenswrapper[5094]: E0220 08:58:43.441537 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0\": container with ID starting with 2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0 not found: ID does not exist" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.441572 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0"} err="failed to get container status \"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0\": rpc error: code = NotFound desc = could not find container \"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0\": container with ID starting with 2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0 not found: ID does not exist" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.441593 5094 scope.go:117] "RemoveContainer" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" Feb 20 08:58:43 crc kubenswrapper[5094]: E0220 
08:58:43.442177 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a\": container with ID starting with 75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a not found: ID does not exist" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.442208 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a"} err="failed to get container status \"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a\": rpc error: code = NotFound desc = could not find container \"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a\": container with ID starting with 75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a not found: ID does not exist" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.859653 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" path="/var/lib/kubelet/pods/49822999-3ac2-4409-ac86-dec02d5e0e7b/volumes" Feb 20 08:58:55 crc kubenswrapper[5094]: I0220 08:58:55.851323 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:55 crc kubenswrapper[5094]: E0220 08:58:55.852304 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:09 crc kubenswrapper[5094]: I0220 08:59:09.840730 
5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:09 crc kubenswrapper[5094]: E0220 08:59:09.841563 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:24 crc kubenswrapper[5094]: I0220 08:59:24.840653 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:24 crc kubenswrapper[5094]: E0220 08:59:24.841540 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:36 crc kubenswrapper[5094]: I0220 08:59:36.840341 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:36 crc kubenswrapper[5094]: E0220 08:59:36.841212 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:48 crc kubenswrapper[5094]: I0220 
08:59:48.840864 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:48 crc kubenswrapper[5094]: E0220 08:59:48.841816 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.183827 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.184863 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="extract-utilities" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.184881 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="extract-utilities" Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.184894 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.184899 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.184920 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="extract-content" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.184927 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" 
containerName="extract-content" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.185121 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.185881 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.189891 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.191871 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.193024 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.279237 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.279309 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.279351 
5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.383186 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.383408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.383468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.384162 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.394641 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.446576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.514085 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.840695 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.841271 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:00:01 crc kubenswrapper[5094]: I0220 09:00:01.041940 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:00:01 crc kubenswrapper[5094]: I0220 09:00:01.105748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" event={"ID":"30508b7a-ac76-48d8-822c-65a32552ca80","Type":"ContainerStarted","Data":"8a0b65f8627b8068f81e5abc2ed7bbee8aeb2ab222b31511575cdd1d3322e1f9"} Feb 20 09:00:02 crc kubenswrapper[5094]: I0220 09:00:02.116361 5094 generic.go:334] "Generic (PLEG): container finished" podID="30508b7a-ac76-48d8-822c-65a32552ca80" containerID="9dd7ec3da040b20e94b1ef4ad1e6147baa04fe89a2ea1cd7d18c7a1def8587f9" exitCode=0 Feb 20 09:00:02 crc kubenswrapper[5094]: I0220 09:00:02.116439 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" event={"ID":"30508b7a-ac76-48d8-822c-65a32552ca80","Type":"ContainerDied","Data":"9dd7ec3da040b20e94b1ef4ad1e6147baa04fe89a2ea1cd7d18c7a1def8587f9"} Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.487033 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.565950 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"30508b7a-ac76-48d8-822c-65a32552ca80\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.566167 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"30508b7a-ac76-48d8-822c-65a32552ca80\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.566212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"30508b7a-ac76-48d8-822c-65a32552ca80\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.566625 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume" (OuterVolumeSpecName: "config-volume") pod "30508b7a-ac76-48d8-822c-65a32552ca80" (UID: "30508b7a-ac76-48d8-822c-65a32552ca80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.571195 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz" (OuterVolumeSpecName: "kube-api-access-8n4sz") pod "30508b7a-ac76-48d8-822c-65a32552ca80" (UID: "30508b7a-ac76-48d8-822c-65a32552ca80"). 
InnerVolumeSpecName "kube-api-access-8n4sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.571519 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "30508b7a-ac76-48d8-822c-65a32552ca80" (UID: "30508b7a-ac76-48d8-822c-65a32552ca80"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.669559 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.669614 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.669636 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.135014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" event={"ID":"30508b7a-ac76-48d8-822c-65a32552ca80","Type":"ContainerDied","Data":"8a0b65f8627b8068f81e5abc2ed7bbee8aeb2ab222b31511575cdd1d3322e1f9"} Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.135572 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a0b65f8627b8068f81e5abc2ed7bbee8aeb2ab222b31511575cdd1d3322e1f9" Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.135091 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.563731 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"] Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.572083 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"] Feb 20 09:00:05 crc kubenswrapper[5094]: I0220 09:00:05.851424 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" path="/var/lib/kubelet/pods/0b1b88d4-fc9b-465d-907e-7abf6c46c919/volumes" Feb 20 09:00:14 crc kubenswrapper[5094]: I0220 09:00:14.841546 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 09:00:15 crc kubenswrapper[5094]: I0220 09:00:15.232132 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33"} Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.266974 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:33 crc kubenswrapper[5094]: E0220 09:00:33.268379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30508b7a-ac76-48d8-822c-65a32552ca80" containerName="collect-profiles" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.268400 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30508b7a-ac76-48d8-822c-65a32552ca80" containerName="collect-profiles" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.268920 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="30508b7a-ac76-48d8-822c-65a32552ca80" containerName="collect-profiles" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.271575 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.276273 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.471448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.471822 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.471857 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"redhat-operators-gnd5s\" (UID: 
\"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574688 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.593790 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " 
pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.893073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:34 crc kubenswrapper[5094]: I0220 09:00:34.408601 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:34 crc kubenswrapper[5094]: W0220 09:00:34.414344 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35bec39e_02b7_45c3_b331_ac00eb024eca.slice/crio-db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea WatchSource:0}: Error finding container db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea: Status 404 returned error can't find the container with id db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea Feb 20 09:00:34 crc kubenswrapper[5094]: I0220 09:00:34.434177 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerStarted","Data":"db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea"} Feb 20 09:00:35 crc kubenswrapper[5094]: I0220 09:00:35.445924 5094 generic.go:334] "Generic (PLEG): container finished" podID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" exitCode=0 Feb 20 09:00:35 crc kubenswrapper[5094]: I0220 09:00:35.445977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519"} Feb 20 09:00:36 crc kubenswrapper[5094]: I0220 09:00:36.459982 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerStarted","Data":"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb"} Feb 20 09:00:41 crc kubenswrapper[5094]: I0220 09:00:41.510110 5094 generic.go:334] "Generic (PLEG): container finished" podID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" exitCode=0 Feb 20 09:00:41 crc kubenswrapper[5094]: I0220 09:00:41.510185 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb"} Feb 20 09:00:42 crc kubenswrapper[5094]: I0220 09:00:42.522079 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerStarted","Data":"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885"} Feb 20 09:00:42 crc kubenswrapper[5094]: I0220 09:00:42.552661 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gnd5s" podStartSLOduration=3.060362197 podStartE2EDuration="9.552642058s" podCreationTimestamp="2026-02-20 09:00:33 +0000 UTC" firstStartedPulling="2026-02-20 09:00:35.44846695 +0000 UTC m=+8050.321093661" lastFinishedPulling="2026-02-20 09:00:41.940746811 +0000 UTC m=+8056.813373522" observedRunningTime="2026-02-20 09:00:42.545019836 +0000 UTC m=+8057.417646547" watchObservedRunningTime="2026-02-20 09:00:42.552642058 +0000 UTC m=+8057.425268769" Feb 20 09:00:42 crc kubenswrapper[5094]: I0220 09:00:42.935413 5094 scope.go:117] "RemoveContainer" containerID="df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9" Feb 20 09:00:43 crc kubenswrapper[5094]: I0220 09:00:43.893419 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:43 crc kubenswrapper[5094]: I0220 09:00:43.893755 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:44 crc kubenswrapper[5094]: I0220 09:00:44.949516 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gnd5s" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" probeResult="failure" output=< Feb 20 09:00:44 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:00:44 crc kubenswrapper[5094]: > Feb 20 09:00:53 crc kubenswrapper[5094]: I0220 09:00:53.974196 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.034571 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.060579 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.091462 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-zdzg9"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.111744 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.121504 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-zdzg9"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.229645 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:55 crc kubenswrapper[5094]: I0220 09:00:55.637379 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gnd5s" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" containerID="cri-o://4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" gracePeriod=2 Feb 20 09:00:55 crc kubenswrapper[5094]: I0220 09:00:55.855226 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" path="/var/lib/kubelet/pods/1068d86d-d730-4dab-8aaf-12c5a5c62a70/volumes" Feb 20 09:00:55 crc kubenswrapper[5094]: I0220 09:00:55.856763 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" path="/var/lib/kubelet/pods/674e60ac-3253-4c4c-8e5b-7a59ed2e8989/volumes" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.140070 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.245953 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"35bec39e-02b7-45c3-b331-ac00eb024eca\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.246100 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"35bec39e-02b7-45c3-b331-ac00eb024eca\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.246240 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod 
\"35bec39e-02b7-45c3-b331-ac00eb024eca\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.248063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities" (OuterVolumeSpecName: "utilities") pod "35bec39e-02b7-45c3-b331-ac00eb024eca" (UID: "35bec39e-02b7-45c3-b331-ac00eb024eca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.254834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q" (OuterVolumeSpecName: "kube-api-access-bdn7q") pod "35bec39e-02b7-45c3-b331-ac00eb024eca" (UID: "35bec39e-02b7-45c3-b331-ac00eb024eca"). InnerVolumeSpecName "kube-api-access-bdn7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.352791 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.353020 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.400181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35bec39e-02b7-45c3-b331-ac00eb024eca" (UID: "35bec39e-02b7-45c3-b331-ac00eb024eca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.455078 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649695 5094 generic.go:334] "Generic (PLEG): container finished" podID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" exitCode=0 Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885"} Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649789 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea"} Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649830 5094 scope.go:117] "RemoveContainer" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.695728 5094 scope.go:117] "RemoveContainer" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.700185 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.709189 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.718643 5094 scope.go:117] "RemoveContainer" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.786868 5094 scope.go:117] "RemoveContainer" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" Feb 20 09:00:56 crc kubenswrapper[5094]: E0220 09:00:56.787456 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885\": container with ID starting with 4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885 not found: ID does not exist" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.787507 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885"} err="failed to get container status \"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885\": rpc error: code = NotFound desc = could not find container \"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885\": container with ID starting with 4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885 not found: ID does not exist" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.787537 5094 scope.go:117] "RemoveContainer" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" Feb 20 09:00:56 crc kubenswrapper[5094]: E0220 09:00:56.788272 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb\": container with ID starting with 6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb not found: ID does not exist" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.788316 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb"} err="failed to get container status \"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb\": rpc error: code = NotFound desc = could not find container \"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb\": container with ID starting with 6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb not found: ID does not exist" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.788369 5094 scope.go:117] "RemoveContainer" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" Feb 20 09:00:56 crc kubenswrapper[5094]: E0220 
09:00:56.788751 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519\": container with ID starting with f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519 not found: ID does not exist" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.788785 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519"} err="failed to get container status \"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519\": rpc error: code = NotFound desc = could not find container \"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519\": container with ID starting with f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519 not found: ID does not exist" Feb 20 09:00:57 crc kubenswrapper[5094]: I0220 09:00:57.857121 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" path="/var/lib/kubelet/pods/35bec39e-02b7-45c3-b331-ac00eb024eca/volumes" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.147909 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29526301-wg5rm"] Feb 20 09:01:00 crc kubenswrapper[5094]: E0220 09:01:00.148587 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148600 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" Feb 20 09:01:00 crc kubenswrapper[5094]: E0220 09:01:00.148653 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" 
containerName="extract-utilities" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148660 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="extract-utilities" Feb 20 09:01:00 crc kubenswrapper[5094]: E0220 09:01:00.148674 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="extract-content" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148681 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="extract-content" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148944 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.149680 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.164000 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526301-wg5rm"] Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.240848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.240912 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc 
kubenswrapper[5094]: I0220 09:01:00.241076 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.241116 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.342978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.343123 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.343154 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 
09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.343222 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.349554 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.349920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.361280 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.362303 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.483678 5094 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.967163 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526301-wg5rm"] Feb 20 09:01:00 crc kubenswrapper[5094]: W0220 09:01:00.972446 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba68c6b8_04f9_4515_8e85_3e7b4ca9615b.slice/crio-d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8 WatchSource:0}: Error finding container d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8: Status 404 returned error can't find the container with id d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8 Feb 20 09:01:01 crc kubenswrapper[5094]: I0220 09:01:01.702146 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerStarted","Data":"dd6f537c0f31c70b8ddd13f131a51bcaf7ca2a867d0122b03fdb3921b3efdb82"} Feb 20 09:01:01 crc kubenswrapper[5094]: I0220 09:01:01.702728 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerStarted","Data":"d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8"} Feb 20 09:01:01 crc kubenswrapper[5094]: I0220 09:01:01.728330 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29526301-wg5rm" podStartSLOduration=1.728314135 podStartE2EDuration="1.728314135s" podCreationTimestamp="2026-02-20 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:01:01.721993132 +0000 UTC m=+8076.594619843" watchObservedRunningTime="2026-02-20 09:01:01.728314135 +0000 UTC m=+8076.600940836" Feb 20 
09:01:03 crc kubenswrapper[5094]: I0220 09:01:03.719600 5094 generic.go:334] "Generic (PLEG): container finished" podID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerID="dd6f537c0f31c70b8ddd13f131a51bcaf7ca2a867d0122b03fdb3921b3efdb82" exitCode=0 Feb 20 09:01:03 crc kubenswrapper[5094]: I0220 09:01:03.719693 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerDied","Data":"dd6f537c0f31c70b8ddd13f131a51bcaf7ca2a867d0122b03fdb3921b3efdb82"} Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.159653 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.340449 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.340966 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.341131 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.341673 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfrhn\" (UniqueName: 
\"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") "
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.345880 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn" (OuterVolumeSpecName: "kube-api-access-tfrhn") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "kube-api-access-tfrhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.357961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.366362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.412325 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data" (OuterVolumeSpecName: "config-data") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443517 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443546 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443556 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443566 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.740661 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerDied","Data":"d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8"}
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.741007 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8"
Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.740728 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm"
Feb 20 09:01:06 crc kubenswrapper[5094]: I0220 09:01:06.752641 5094 generic.go:334] "Generic (PLEG): container finished" podID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerID="a63a79298a636726404dd99afe9f61c35ff225d0787a9c54da454b3cf54a459f" exitCode=0
Feb 20 09:01:06 crc kubenswrapper[5094]: I0220 09:01:06.752769 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerDied","Data":"a63a79298a636726404dd99afe9f61c35ff225d0787a9c54da454b3cf54a459f"}
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.050320 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-fmrw9"]
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.059903 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-fmrw9"]
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.239150 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.401740 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") "
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.402061 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") "
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.402135 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") "
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.402220 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") "
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.408874 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq" (OuterVolumeSpecName: "kube-api-access-tcppq") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "kube-api-access-tcppq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.409519 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:01:08 crc kubenswrapper[5094]: E0220 09:01:08.431418 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory podName:7894eb94-d4dd-4035-af5b-5994b4ae6d2f nodeName:}" failed. No retries permitted until 2026-02-20 09:01:08.931383848 +0000 UTC m=+8083.804010559 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f") : error deleting /var/lib/kubelet/pods/7894eb94-d4dd-4035-af5b-5994b4ae6d2f/volume-subpaths: remove /var/lib/kubelet/pods/7894eb94-d4dd-4035-af5b-5994b4ae6d2f/volume-subpaths: no such file or directory
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.435266 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.505653 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.505765 5094 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.505799 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.775131 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerDied","Data":"ca240917156fd9c5acb4173083f283943d7a62c3a3cbea59cf29d0d548923640"}
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.775190 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca240917156fd9c5acb4173083f283943d7a62c3a3cbea59cf29d0d548923640"
Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.775201 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"
Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.015979 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") "
Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.021891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory" (OuterVolumeSpecName: "inventory") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.119203 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.854347 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" path="/var/lib/kubelet/pods/791e2b3b-9d51-41fd-bf38-5b66849b5b77/volumes"
Feb 20 09:01:43 crc kubenswrapper[5094]: I0220 09:01:43.014813 5094 scope.go:117] "RemoveContainer" containerID="54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349"
Feb 20 09:01:43 crc kubenswrapper[5094]: I0220 09:01:43.043589 5094 scope.go:117] "RemoveContainer" containerID="da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3"
Feb 20 09:01:43 crc kubenswrapper[5094]: I0220 09:01:43.089243 5094 scope.go:117] "RemoveContainer" containerID="f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.040573 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"]
Feb 20 09:02:08 crc kubenswrapper[5094]: E0220 09:02:08.041642 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041658 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker"
Feb 20 09:02:08 crc kubenswrapper[5094]: E0220 09:02:08.041678 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerName="keystone-cron"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041684 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerName="keystone-cron"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041911 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerName="keystone-cron"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041927 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.043389 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.052337 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"]
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.113257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.113366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.113430 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.215632 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.215792 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.215880 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.216320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.216365 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.239534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.389617 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.881297 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"]
Feb 20 09:02:09 crc kubenswrapper[5094]: I0220 09:02:09.417001 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a" exitCode=0
Feb 20 09:02:09 crc kubenswrapper[5094]: I0220 09:02:09.417088 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a"}
Feb 20 09:02:09 crc kubenswrapper[5094]: I0220 09:02:09.417354 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerStarted","Data":"9793592739c2c4402f0646adb70f87a5b67be5909b34100bae0371dc05a317d7"}
Feb 20 09:02:10 crc kubenswrapper[5094]: I0220 09:02:10.431183 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerStarted","Data":"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"}
Feb 20 09:02:12 crc kubenswrapper[5094]: I0220 09:02:12.454928 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2" exitCode=0
Feb 20 09:02:12 crc kubenswrapper[5094]: I0220 09:02:12.454999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"}
Feb 20 09:02:13 crc kubenswrapper[5094]: I0220 09:02:13.478411 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerStarted","Data":"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"}
Feb 20 09:02:13 crc kubenswrapper[5094]: I0220 09:02:13.506567 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8h5d" podStartSLOduration=2.083010457 podStartE2EDuration="5.50651407s" podCreationTimestamp="2026-02-20 09:02:08 +0000 UTC" firstStartedPulling="2026-02-20 09:02:09.419608647 +0000 UTC m=+8144.292235368" lastFinishedPulling="2026-02-20 09:02:12.84311227 +0000 UTC m=+8147.715738981" observedRunningTime="2026-02-20 09:02:13.4973713 +0000 UTC m=+8148.369998011" watchObservedRunningTime="2026-02-20 09:02:13.50651407 +0000 UTC m=+8148.379140781"
Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.390722 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.391373 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.464463 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.601489 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.715411 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"]
Feb 20 09:02:20 crc kubenswrapper[5094]: I0220 09:02:20.548682 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8h5d" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server" containerID="cri-o://1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191" gracePeriod=2
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.519728 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571094 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191" exitCode=0
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"}
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571175 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"9793592739c2c4402f0646adb70f87a5b67be5909b34100bae0371dc05a317d7"}
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571215 5094 scope.go:117] "RemoveContainer" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571302 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.593167 5094 scope.go:117] "RemoveContainer" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.614950 5094 scope.go:117] "RemoveContainer" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.618446 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") "
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.618519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") "
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.618766 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") "
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.619490 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities" (OuterVolumeSpecName: "utilities") pod "5e37664a-95ca-4fbb-8991-0af7286b7b9e" (UID: "5e37664a-95ca-4fbb-8991-0af7286b7b9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.626530 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr" (OuterVolumeSpecName: "kube-api-access-f72qr") pod "5e37664a-95ca-4fbb-8991-0af7286b7b9e" (UID: "5e37664a-95ca-4fbb-8991-0af7286b7b9e"). InnerVolumeSpecName "kube-api-access-f72qr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.674539 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e37664a-95ca-4fbb-8991-0af7286b7b9e" (UID: "5e37664a-95ca-4fbb-8991-0af7286b7b9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.697639 5094 scope.go:117] "RemoveContainer" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"
Feb 20 09:02:21 crc kubenswrapper[5094]: E0220 09:02:21.698148 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191\": container with ID starting with 1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191 not found: ID does not exist" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698213 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"} err="failed to get container status \"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191\": rpc error: code = NotFound desc = could not find container \"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191\": container with ID starting with 1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191 not found: ID does not exist"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698247 5094 scope.go:117] "RemoveContainer" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"
Feb 20 09:02:21 crc kubenswrapper[5094]: E0220 09:02:21.698643 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2\": container with ID starting with 86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2 not found: ID does not exist" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698762 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"} err="failed to get container status \"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2\": rpc error: code = NotFound desc = could not find container \"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2\": container with ID starting with 86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2 not found: ID does not exist"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698815 5094 scope.go:117] "RemoveContainer" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a"
Feb 20 09:02:21 crc kubenswrapper[5094]: E0220 09:02:21.699172 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a\": container with ID starting with 7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a not found: ID does not exist" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.699199 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a"} err="failed to get container status \"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a\": rpc error: code = NotFound desc = could not find container \"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a\": container with ID starting with 7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a not found: ID does not exist"
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.721899 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.721945 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.722012 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") on node \"crc\" DevicePath \"\""
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.898745 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"]
Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.907763 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"]
Feb 20 09:02:23 crc kubenswrapper[5094]: I0220 09:02:23.851354 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" path="/var/lib/kubelet/pods/5e37664a-95ca-4fbb-8991-0af7286b7b9e/volumes"
Feb 20 09:02:34 crc kubenswrapper[5094]: I0220 09:02:34.106924 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:02:34 crc kubenswrapper[5094]: I0220 09:02:34.107523 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:03:04 crc kubenswrapper[5094]: I0220 09:03:04.107795 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:03:04 crc kubenswrapper[5094]: I0220 09:03:04.108463 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.056465 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-drwqv"]
Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.064730 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"]
Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.073273 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-drwqv"]
Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.081595 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"]
Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.860085 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="537981f5-8e74-406f-9199-8bac8aa60903" path="/var/lib/kubelet/pods/537981f5-8e74-406f-9199-8bac8aa60903/volumes"
Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.861294 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" path="/var/lib/kubelet/pods/ae290c11-18c8-4d9a-90d3-8f2219084a78/volumes"
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.107124 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.107523 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.107576 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.108568 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.108630 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33" gracePeriod=600
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.371896 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33" exitCode=0
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.371954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33"}
Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.372049 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"
Feb 20 09:03:35 crc kubenswrapper[5094]: I0220 09:03:35.381731 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"}
Feb 20 09:03:42 crc kubenswrapper[5094]: I0220 09:03:42.042842 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-bg4qh"]
Feb 20 09:03:42 crc kubenswrapper[5094]: I0220 09:03:42.065907 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-bg4qh"]
Feb 20 09:03:43 crc kubenswrapper[5094]: I0220 09:03:43.307984 5094 scope.go:117] "RemoveContainer" containerID="d3221fcda11fc25108efa9fb80c6774c8d350491f8d20f83e1f5fae473f8e306"
Feb 20 09:03:43 crc kubenswrapper[5094]: I0220 09:03:43.341264 5094 scope.go:117] "RemoveContainer" containerID="7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6"
Feb 20 09:03:43 crc kubenswrapper[5094]: I0220 09:03:43.854680 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" path="/var/lib/kubelet/pods/2950b502-5079-4a08-8aaf-f0b5d376a3f2/volumes"
Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.049956 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-mtnn8"]
Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.060938 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"]
Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.069425 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-mtnn8"]
Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.080772 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"]
Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.857926 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" path="/var/lib/kubelet/pods/6bad0291-15d9-4dc5-acd6-26bc8d8aad76/volumes"
Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.859471 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" path="/var/lib/kubelet/pods/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3/volumes"
Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.480803 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvxdd"]
Feb 20 09:04:14 crc kubenswrapper[5094]: E0220 09:04:14.482572 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-utilities"
Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.482614 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-utilities"
Feb 20 09:04:14 crc kubenswrapper[5094]: E0220 09:04:14.482652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server"
Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.482670 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server"
Feb 20 09:04:14 crc kubenswrapper[5094]: E0220 09:04:14.482745 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-content"
Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.482769 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-content"
Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.483280 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server"
Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.486447 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.503197 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.659153 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.659301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.659353 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.761902 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762087 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762145 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762749 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.792482 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.831850 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.048849 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-87lwb"] Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.057274 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-87lwb"] Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.365640 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.799032 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" exitCode=0 Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.799166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d"} Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.799307 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerStarted","Data":"cf4431142b094e688304265b91f9daccea2c70afd9f108b3b0281393c36eadf9"} Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.801212 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.858770 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" path="/var/lib/kubelet/pods/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00/volumes" Feb 20 09:04:16 crc kubenswrapper[5094]: I0220 09:04:16.826060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerStarted","Data":"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d"} Feb 20 09:04:17 crc kubenswrapper[5094]: E0220 09:04:17.820136 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice/crio-conmon-91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice/crio-91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d.scope\": RecentStats: unable to find data in memory cache]" Feb 20 09:04:17 crc kubenswrapper[5094]: I0220 09:04:17.840162 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" exitCode=0 Feb 20 09:04:17 crc kubenswrapper[5094]: I0220 09:04:17.852454 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d"} Feb 20 09:04:18 crc kubenswrapper[5094]: I0220 09:04:18.853771 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerStarted","Data":"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97"} Feb 20 09:04:18 crc kubenswrapper[5094]: I0220 09:04:18.886042 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvxdd" podStartSLOduration=2.396745902 
podStartE2EDuration="4.88601804s" podCreationTimestamp="2026-02-20 09:04:14 +0000 UTC" firstStartedPulling="2026-02-20 09:04:15.800990339 +0000 UTC m=+8270.673617050" lastFinishedPulling="2026-02-20 09:04:18.290262477 +0000 UTC m=+8273.162889188" observedRunningTime="2026-02-20 09:04:18.876735706 +0000 UTC m=+8273.749362417" watchObservedRunningTime="2026-02-20 09:04:18.88601804 +0000 UTC m=+8273.758644751" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.832312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.833749 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.887463 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.953031 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:25 crc kubenswrapper[5094]: I0220 09:04:25.121538 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:26 crc kubenswrapper[5094]: I0220 09:04:26.927202 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerDied","Data":"24186d99d2e091e90cea103f3ededd5ae0be73e5479d2f80e87c425b36de8252"} Feb 20 09:04:26 crc kubenswrapper[5094]: I0220 09:04:26.927128 5094 generic.go:334] "Generic (PLEG): container finished" podID="110791b2-a067-409d-9970-9db4868f0d4d" containerID="24186d99d2e091e90cea103f3ededd5ae0be73e5479d2f80e87c425b36de8252" exitCode=0 Feb 20 09:04:26 crc kubenswrapper[5094]: I0220 
09:04:26.927962 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvxdd" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" containerID="cri-o://d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" gracePeriod=2 Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.374318 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.529204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"ee43a380-78c4-4fdc-957a-a76021a27b53\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.529393 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"ee43a380-78c4-4fdc-957a-a76021a27b53\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.529509 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"ee43a380-78c4-4fdc-957a-a76021a27b53\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.530660 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities" (OuterVolumeSpecName: "utilities") pod "ee43a380-78c4-4fdc-957a-a76021a27b53" (UID: "ee43a380-78c4-4fdc-957a-a76021a27b53"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.534610 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86" (OuterVolumeSpecName: "kube-api-access-mjk86") pod "ee43a380-78c4-4fdc-957a-a76021a27b53" (UID: "ee43a380-78c4-4fdc-957a-a76021a27b53"). InnerVolumeSpecName "kube-api-access-mjk86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.581327 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee43a380-78c4-4fdc-957a-a76021a27b53" (UID: "ee43a380-78c4-4fdc-957a-a76021a27b53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.632544 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.632585 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.632601 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.940932 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee43a380-78c4-4fdc-957a-a76021a27b53" 
containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" exitCode=0 Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941136 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97"} Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"cf4431142b094e688304265b91f9daccea2c70afd9f108b3b0281393c36eadf9"} Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941813 5094 scope.go:117] "RemoveContainer" containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.970973 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.989163 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.000663 5094 scope.go:117] "RemoveContainer" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.026375 5094 scope.go:117] "RemoveContainer" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.087888 5094 scope.go:117] "RemoveContainer" containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" Feb 20 
09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.088466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97\": container with ID starting with d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97 not found: ID does not exist" containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.088501 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97"} err="failed to get container status \"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97\": rpc error: code = NotFound desc = could not find container \"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97\": container with ID starting with d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97 not found: ID does not exist" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.088523 5094 scope.go:117] "RemoveContainer" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" Feb 20 09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.089020 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d\": container with ID starting with 91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d not found: ID does not exist" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.089042 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d"} err="failed to get container status 
\"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d\": rpc error: code = NotFound desc = could not find container \"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d\": container with ID starting with 91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d not found: ID does not exist" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.089055 5094 scope.go:117] "RemoveContainer" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" Feb 20 09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.089215 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d\": container with ID starting with f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d not found: ID does not exist" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.089236 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d"} err="failed to get container status \"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d\": rpc error: code = NotFound desc = could not find container \"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d\": container with ID starting with f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d not found: ID does not exist" Feb 20 09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.143874 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice/crio-cf4431142b094e688304265b91f9daccea2c70afd9f108b3b0281393c36eadf9\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice\": RecentStats: unable to find data in memory cache]" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.406296 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.455860 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.456202 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.456791 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.456932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.457086 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.461808 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph" (OuterVolumeSpecName: "ceph") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.461999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt" (OuterVolumeSpecName: "kube-api-access-svmxt") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "kube-api-access-svmxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.473169 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.483908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.498684 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory" (OuterVolumeSpecName: "inventory") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559188 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559231 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559243 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559252 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559264 5094 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.950876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerDied","Data":"0c39006c151d238a3f167340ddd6cab9c442fbcdd2b65206e97f1302317761f1"} Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.950897 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.951283 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c39006c151d238a3f167340ddd6cab9c442fbcdd2b65206e97f1302317761f1" Feb 20 09:04:29 crc kubenswrapper[5094]: I0220 09:04:29.852200 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" path="/var/lib/kubelet/pods/ee43a380-78c4-4fdc-957a-a76021a27b53/volumes" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768107 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m7mmg"] Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768643 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-utilities" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768663 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-utilities" Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768685 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-content" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768695 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-content" Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768757 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768773 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768791 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110791b2-a067-409d-9970-9db4868f0d4d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768803 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="110791b2-a067-409d-9970-9db4868f0d4d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.769162 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="110791b2-a067-409d-9970-9db4868f0d4d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.769202 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.770224 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.779492 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-452ln"] Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.780812 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.781647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.781952 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.782074 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.782283 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.782780 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.784065 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.803915 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m7mmg"] Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.811775 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-452ln"] Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906528 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906598 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906647 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906728 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906889 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008663 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008877 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008914 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") 
pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008957 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008987 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.009010 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.016517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.016677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.017014 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.017466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.023113 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.026631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.034738 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.036142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.039327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.086474 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.100094 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.626242 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m7mmg"] Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.751632 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-452ln"] Feb 20 09:04:31 crc kubenswrapper[5094]: W0220 09:04:31.755085 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3baf01f_744b_44ed_b3c8_2ec288f77e59.slice/crio-c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4 WatchSource:0}: Error finding container c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4: Status 404 returned error can't find the container with id c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4 Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.999401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerStarted","Data":"b5119a9e3d1361f0b6bbf1a075495f30b7f1bf3071d0a3ad04118e1bd1708bb2"} Feb 20 09:04:32 crc kubenswrapper[5094]: I0220 09:04:32.000413 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerStarted","Data":"c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4"} Feb 20 09:04:33 crc kubenswrapper[5094]: I0220 09:04:33.025732 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerStarted","Data":"d7b5776600a90eebbccedbaae5eb73db115ddeee382bcc00638e0b193233eedc"} Feb 20 09:04:33 crc 
kubenswrapper[5094]: I0220 09:04:33.028116 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerStarted","Data":"28b6a1153e01e932715ae5b8469ca02823b29ac957c328d37bd8a8b4033ff372"} Feb 20 09:04:33 crc kubenswrapper[5094]: I0220 09:04:33.059143 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-networker-452ln" podStartSLOduration=2.668105491 podStartE2EDuration="3.059124068s" podCreationTimestamp="2026-02-20 09:04:30 +0000 UTC" firstStartedPulling="2026-02-20 09:04:31.758227321 +0000 UTC m=+8286.630854032" lastFinishedPulling="2026-02-20 09:04:32.149245898 +0000 UTC m=+8287.021872609" observedRunningTime="2026-02-20 09:04:33.058547944 +0000 UTC m=+8287.931174675" watchObservedRunningTime="2026-02-20 09:04:33.059124068 +0000 UTC m=+8287.931750789" Feb 20 09:04:33 crc kubenswrapper[5094]: I0220 09:04:33.109866 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" podStartSLOduration=2.432953013 podStartE2EDuration="3.109839398s" podCreationTimestamp="2026-02-20 09:04:30 +0000 UTC" firstStartedPulling="2026-02-20 09:04:31.641170854 +0000 UTC m=+8286.513797565" lastFinishedPulling="2026-02-20 09:04:32.318057239 +0000 UTC m=+8287.190683950" observedRunningTime="2026-02-20 09:04:33.08830472 +0000 UTC m=+8287.960931441" watchObservedRunningTime="2026-02-20 09:04:33.109839398 +0000 UTC m=+8287.982466119" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 09:04:43.444412 5094 scope.go:117] "RemoveContainer" containerID="40e80a7f49d2a8cd8ede69f04221413d74bc3298b5502921432bc86e342a4f7d" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 09:04:43.472150 5094 scope.go:117] "RemoveContainer" containerID="8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 
09:04:43.546406 5094 scope.go:117] "RemoveContainer" containerID="b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 09:04:43.614444 5094 scope.go:117] "RemoveContainer" containerID="fcec52d45c535185ac325065c4cab11c829c4a1ebad6b2123939c3a35f4b9360" Feb 20 09:05:34 crc kubenswrapper[5094]: I0220 09:05:34.106509 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:05:34 crc kubenswrapper[5094]: I0220 09:05:34.107302 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:06:04 crc kubenswrapper[5094]: I0220 09:06:04.106460 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:06:04 crc kubenswrapper[5094]: I0220 09:06:04.107513 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.106690 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.107916 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.107991 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.108984 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.109064 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" gracePeriod=600 Feb 20 09:06:34 crc kubenswrapper[5094]: E0220 09:06:34.243438 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.248954 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" exitCode=0 Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.249008 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"} Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.249056 5094 scope.go:117] "RemoveContainer" containerID="8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33" Feb 20 09:06:35 crc kubenswrapper[5094]: I0220 09:06:35.263580 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:06:35 crc kubenswrapper[5094]: E0220 09:06:35.263874 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:06:49 crc kubenswrapper[5094]: I0220 09:06:49.841147 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:06:49 crc kubenswrapper[5094]: E0220 09:06:49.841921 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:02 crc kubenswrapper[5094]: I0220 09:07:02.840970 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:02 crc kubenswrapper[5094]: E0220 09:07:02.841857 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:17 crc kubenswrapper[5094]: I0220 09:07:17.841009 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:17 crc kubenswrapper[5094]: E0220 09:07:17.841986 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:21 crc kubenswrapper[5094]: I0220 09:07:21.711911 5094 generic.go:334] "Generic (PLEG): container finished" podID="30a55d13-2efe-4d90-bcef-14aedc741079" containerID="28b6a1153e01e932715ae5b8469ca02823b29ac957c328d37bd8a8b4033ff372" exitCode=0 Feb 20 09:07:21 crc kubenswrapper[5094]: I0220 09:07:21.711963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerDied","Data":"28b6a1153e01e932715ae5b8469ca02823b29ac957c328d37bd8a8b4033ff372"} Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.155409 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190612 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190800 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190892 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190979 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.191017 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.197685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph" (OuterVolumeSpecName: "ceph") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.197989 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.199939 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv" (OuterVolumeSpecName: "kube-api-access-hc2bv") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "kube-api-access-hc2bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.219963 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory" (OuterVolumeSpecName: "inventory") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.231939 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294084 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294111 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294125 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294133 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294145 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.729448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerDied","Data":"b5119a9e3d1361f0b6bbf1a075495f30b7f1bf3071d0a3ad04118e1bd1708bb2"} Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.729807 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5119a9e3d1361f0b6bbf1a075495f30b7f1bf3071d0a3ad04118e1bd1708bb2" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.729500 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.838439 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4zfxb"] Feb 20 09:07:23 crc kubenswrapper[5094]: E0220 09:07:23.839054 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a55d13-2efe-4d90-bcef-14aedc741079" containerName="bootstrap-openstack-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.839081 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a55d13-2efe-4d90-bcef-14aedc741079" containerName="bootstrap-openstack-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.839302 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a55d13-2efe-4d90-bcef-14aedc741079" containerName="bootstrap-openstack-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.840330 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.842940 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.843120 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.893280 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4zfxb"] Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.905054 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.905120 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.905192 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc 
kubenswrapper[5094]: I0220 09:07:23.905290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007597 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007688 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007926 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: 
\"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.013498 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.022080 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.022085 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.028248 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.173160 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.691995 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4zfxb"] Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.738942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerStarted","Data":"1d7eecaf4ffd0c7ac6f9163f2ee12164460f0f16736a4c9bfdbe1062f525b936"} Feb 20 09:07:25 crc kubenswrapper[5094]: I0220 09:07:25.749598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerStarted","Data":"2ccd8439a62576ce780ceaa3d66e444205d4735fa06b8de40606cb4118f1de13"} Feb 20 09:07:25 crc kubenswrapper[5094]: I0220 09:07:25.769673 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" podStartSLOduration=2.239253361 podStartE2EDuration="2.769648361s" podCreationTimestamp="2026-02-20 09:07:23 +0000 UTC" firstStartedPulling="2026-02-20 09:07:24.703119463 +0000 UTC m=+8459.575746174" lastFinishedPulling="2026-02-20 09:07:25.233514463 +0000 UTC m=+8460.106141174" observedRunningTime="2026-02-20 09:07:25.767006318 +0000 UTC m=+8460.639633029" watchObservedRunningTime="2026-02-20 09:07:25.769648361 +0000 UTC m=+8460.642275072" Feb 20 09:07:31 crc kubenswrapper[5094]: I0220 09:07:31.840917 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:31 crc kubenswrapper[5094]: E0220 09:07:31.842004 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:34 crc kubenswrapper[5094]: I0220 09:07:34.843400 5094 generic.go:334] "Generic (PLEG): container finished" podID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerID="d7b5776600a90eebbccedbaae5eb73db115ddeee382bcc00638e0b193233eedc" exitCode=0 Feb 20 09:07:34 crc kubenswrapper[5094]: I0220 09:07:34.843508 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerDied","Data":"d7b5776600a90eebbccedbaae5eb73db115ddeee382bcc00638e0b193233eedc"} Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.365753 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480011 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480060 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480092 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.489898 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.490962 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm" (OuterVolumeSpecName: "kube-api-access-92jdm") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "kube-api-access-92jdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.509395 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.520399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory" (OuterVolumeSpecName: "inventory") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583197 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583241 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583253 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583265 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.867691 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerDied","Data":"c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4"} Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.867752 5094 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.867798 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.926121 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-networker-s48qg"] Feb 20 09:07:36 crc kubenswrapper[5094]: E0220 09:07:36.926576 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerName="bootstrap-openstack-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.926601 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerName="bootstrap-openstack-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.926919 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerName="bootstrap-openstack-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.927659 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.929843 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.943645 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.943951 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-s48qg"] Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.991593 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.991682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.991778 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " 
pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.093686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.093987 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.094147 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.099315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.099586 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.119445 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.249843 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.768947 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-s48qg"] Feb 20 09:07:37 crc kubenswrapper[5094]: W0220 09:07:37.777581 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74755119_ad5b_439b_80bc_57779ffb5161.slice/crio-09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6 WatchSource:0}: Error finding container 09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6: Status 404 returned error can't find the container with id 09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6 Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.879774 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerStarted","Data":"09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6"} Feb 20 09:07:38 crc 
kubenswrapper[5094]: I0220 09:07:38.888640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerStarted","Data":"c060370efba02fe88f084c71b2630e0d32026fa891d4b33faa34cb53fd72fa7d"} Feb 20 09:07:38 crc kubenswrapper[5094]: I0220 09:07:38.917365 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-networker-s48qg" podStartSLOduration=2.191177328 podStartE2EDuration="2.917339389s" podCreationTimestamp="2026-02-20 09:07:36 +0000 UTC" firstStartedPulling="2026-02-20 09:07:37.780868817 +0000 UTC m=+8472.653495518" lastFinishedPulling="2026-02-20 09:07:38.507030868 +0000 UTC m=+8473.379657579" observedRunningTime="2026-02-20 09:07:38.910773251 +0000 UTC m=+8473.783399972" watchObservedRunningTime="2026-02-20 09:07:38.917339389 +0000 UTC m=+8473.789966090" Feb 20 09:07:45 crc kubenswrapper[5094]: I0220 09:07:45.851974 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:45 crc kubenswrapper[5094]: E0220 09:07:45.853158 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:00 crc kubenswrapper[5094]: I0220 09:08:00.849825 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:00 crc kubenswrapper[5094]: E0220 09:08:00.851288 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:15 crc kubenswrapper[5094]: I0220 09:08:15.849238 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:15 crc kubenswrapper[5094]: E0220 09:08:15.850533 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:28 crc kubenswrapper[5094]: I0220 09:08:28.839959 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:28 crc kubenswrapper[5094]: E0220 09:08:28.841278 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:42 crc kubenswrapper[5094]: I0220 09:08:42.840840 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:42 crc kubenswrapper[5094]: E0220 09:08:42.843481 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:49 crc kubenswrapper[5094]: I0220 09:08:49.791220 5094 generic.go:334] "Generic (PLEG): container finished" podID="74755119-ad5b-439b-80bc-57779ffb5161" containerID="c060370efba02fe88f084c71b2630e0d32026fa891d4b33faa34cb53fd72fa7d" exitCode=0 Feb 20 09:08:49 crc kubenswrapper[5094]: I0220 09:08:49.791309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerDied","Data":"c060370efba02fe88f084c71b2630e0d32026fa891d4b33faa34cb53fd72fa7d"} Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.275476 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.387515 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"74755119-ad5b-439b-80bc-57779ffb5161\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.387596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"74755119-ad5b-439b-80bc-57779ffb5161\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.387717 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"74755119-ad5b-439b-80bc-57779ffb5161\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.399416 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc" (OuterVolumeSpecName: "kube-api-access-m76cc") pod "74755119-ad5b-439b-80bc-57779ffb5161" (UID: "74755119-ad5b-439b-80bc-57779ffb5161"). InnerVolumeSpecName "kube-api-access-m76cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.430979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory" (OuterVolumeSpecName: "inventory") pod "74755119-ad5b-439b-80bc-57779ffb5161" (UID: "74755119-ad5b-439b-80bc-57779ffb5161"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.471870 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "74755119-ad5b-439b-80bc-57779ffb5161" (UID: "74755119-ad5b-439b-80bc-57779ffb5161"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.493462 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") on node \"crc\" DevicePath \"\"" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.493520 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.493539 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.807234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerDied","Data":"09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6"} Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.807463 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 
09:08:51.807325 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg"
Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.891294 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-networker-59p5g"]
Feb 20 09:08:51 crc kubenswrapper[5094]: E0220 09:08:51.892071 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74755119-ad5b-439b-80bc-57779ffb5161" containerName="download-cache-openstack-openstack-networker"
Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.892164 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="74755119-ad5b-439b-80bc-57779ffb5161" containerName="download-cache-openstack-openstack-networker"
Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.892531 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="74755119-ad5b-439b-80bc-57779ffb5161" containerName="download-cache-openstack-openstack-networker"
Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.893571 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.897520 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.898028 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.916168 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-59p5g"]
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.002154 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.002228 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.002608 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.104759 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.104859 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.104999 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.109671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.110053 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.125847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.232366 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.812473 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-59p5g"]
Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.816745 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerStarted","Data":"ebb40d52986a5e8028a0edd2009cbe5bef1de50e2634f808b231951cf13031cc"}
Feb 20 09:08:53 crc kubenswrapper[5094]: I0220 09:08:53.834251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerStarted","Data":"5fe5bfe654a87fe11ab6e48fba7983d6c8f9cb8b7a0de1d97571c5ae01535c98"}
Feb 20 09:08:53 crc kubenswrapper[5094]: I0220 09:08:53.844162 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:08:53 crc kubenswrapper[5094]: E0220 09:08:53.844412 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:08:53 crc kubenswrapper[5094]: I0220 09:08:53.850681 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-networker-59p5g" podStartSLOduration=2.319668678 podStartE2EDuration="2.850664242s" podCreationTimestamp="2026-02-20 09:08:51 +0000 UTC" firstStartedPulling="2026-02-20 09:08:52.808170552 +0000 UTC m=+8547.680797263" lastFinishedPulling="2026-02-20 09:08:53.339166126 +0000 UTC m=+8548.211792827" observedRunningTime="2026-02-20 09:08:53.849476213 +0000 UTC m=+8548.722102924" watchObservedRunningTime="2026-02-20 09:08:53.850664242 +0000 UTC m=+8548.723290953"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.661932 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"]
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.676670 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"]
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.676851 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.737646 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.737833 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.737871 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.839524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.839639 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.839675 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.841151 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.841234 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.859409 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.998617 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.470743 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"]
Feb 20 09:09:02 crc kubenswrapper[5094]: W0220 09:09:02.478259 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8429a7_c662_4ac3_968f_306eb9b3ba0e.slice/crio-6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423 WatchSource:0}: Error finding container 6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423: Status 404 returned error can't find the container with id 6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423
Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.937858 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4" exitCode=0
Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.937942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4"}
Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.938244 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerStarted","Data":"6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423"}
Feb 20 09:09:04 crc kubenswrapper[5094]: I0220 09:09:04.956363 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerStarted","Data":"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"}
Feb 20 09:09:05 crc kubenswrapper[5094]: I0220 09:09:05.969388 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e" exitCode=0
Feb 20 09:09:05 crc kubenswrapper[5094]: I0220 09:09:05.969544 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"}
Feb 20 09:09:06 crc kubenswrapper[5094]: I0220 09:09:06.840320 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:09:06 crc kubenswrapper[5094]: E0220 09:09:06.841232 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:09:06 crc kubenswrapper[5094]: I0220 09:09:06.981299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerStarted","Data":"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"}
Feb 20 09:09:07 crc kubenswrapper[5094]: I0220 09:09:07.003589 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l222c" podStartSLOduration=2.57991436 podStartE2EDuration="6.003568477s" podCreationTimestamp="2026-02-20 09:09:01 +0000 UTC" firstStartedPulling="2026-02-20 09:09:02.940794064 +0000 UTC m=+8557.813420775" lastFinishedPulling="2026-02-20 09:09:06.364448181 +0000 UTC m=+8561.237074892" observedRunningTime="2026-02-20 09:09:07.000290868 +0000 UTC m=+8561.872917589" watchObservedRunningTime="2026-02-20 09:09:07.003568477 +0000 UTC m=+8561.876195188"
Feb 20 09:09:11 crc kubenswrapper[5094]: I0220 09:09:11.999101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:11.999973 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:12.061080 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:12.149745 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:12.307665 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"]
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.060100 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l222c" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server" containerID="cri-o://0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319" gracePeriod=2
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.589877 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.734078 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") "
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.734223 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") "
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.735073 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") "
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.735101 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities" (OuterVolumeSpecName: "utilities") pod "ac8429a7-c662-4ac3-968f-306eb9b3ba0e" (UID: "ac8429a7-c662-4ac3-968f-306eb9b3ba0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.737031 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.741928 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967" (OuterVolumeSpecName: "kube-api-access-4p967") pod "ac8429a7-c662-4ac3-968f-306eb9b3ba0e" (UID: "ac8429a7-c662-4ac3-968f-306eb9b3ba0e"). InnerVolumeSpecName "kube-api-access-4p967". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.758946 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac8429a7-c662-4ac3-968f-306eb9b3ba0e" (UID: "ac8429a7-c662-4ac3-968f-306eb9b3ba0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.841668 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.842240 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073227 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319" exitCode=0
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073261 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"}
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073312 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423"}
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073336 5094 scope.go:117] "RemoveContainer" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073342 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.112144 5094 scope.go:117] "RemoveContainer" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.113306 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"]
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.123319 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"]
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.177890 5094 scope.go:117] "RemoveContainer" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.208772 5094 scope.go:117] "RemoveContainer" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"
Feb 20 09:09:15 crc kubenswrapper[5094]: E0220 09:09:15.209486 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319\": container with ID starting with 0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319 not found: ID does not exist" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.209515 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"} err="failed to get container status \"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319\": rpc error: code = NotFound desc = could not find container \"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319\": container with ID starting with 0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319 not found: ID does not exist"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.209538 5094 scope.go:117] "RemoveContainer" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"
Feb 20 09:09:15 crc kubenswrapper[5094]: E0220 09:09:15.210160 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e\": container with ID starting with 8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e not found: ID does not exist" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.210247 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"} err="failed to get container status \"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e\": rpc error: code = NotFound desc = could not find container \"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e\": container with ID starting with 8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e not found: ID does not exist"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.210297 5094 scope.go:117] "RemoveContainer" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4"
Feb 20 09:09:15 crc kubenswrapper[5094]: E0220 09:09:15.210890 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4\": container with ID starting with 1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4 not found: ID does not exist" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.210932 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4"} err="failed to get container status \"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4\": rpc error: code = NotFound desc = could not find container \"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4\": container with ID starting with 1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4 not found: ID does not exist"
Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.851694 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" path="/var/lib/kubelet/pods/ac8429a7-c662-4ac3-968f-306eb9b3ba0e/volumes"
Feb 20 09:09:18 crc kubenswrapper[5094]: I0220 09:09:18.841337 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:09:18 crc kubenswrapper[5094]: E0220 09:09:18.842158 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:09:31 crc kubenswrapper[5094]: I0220 09:09:31.841094 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:09:31 crc kubenswrapper[5094]: E0220 09:09:31.842133 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:09:32 crc kubenswrapper[5094]: I0220 09:09:32.278433 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerID="2ccd8439a62576ce780ceaa3d66e444205d4735fa06b8de40606cb4118f1de13" exitCode=0
Feb 20 09:09:32 crc kubenswrapper[5094]: I0220 09:09:32.278488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerDied","Data":"2ccd8439a62576ce780ceaa3d66e444205d4735fa06b8de40606cb4118f1de13"}
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.766783 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb"
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.879364 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") "
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.879546 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") "
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.879619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") "
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.879670 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") "
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.896982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph" (OuterVolumeSpecName: "ceph") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.897006 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr" (OuterVolumeSpecName: "kube-api-access-wzdxr") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). InnerVolumeSpecName "kube-api-access-wzdxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.909312 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.911280 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory" (OuterVolumeSpecName: "inventory") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982108 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982181 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982196 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982211 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.300071 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerDied","Data":"1d7eecaf4ffd0c7ac6f9163f2ee12164460f0f16736a4c9bfdbe1062f525b936"}
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.300119 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d7eecaf4ffd0c7ac6f9163f2ee12164460f0f16736a4c9bfdbe1062f525b936"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.300193 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.375675 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ggj5f"]
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376318 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-utilities"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376381 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-utilities"
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376454 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376509 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server"
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376576 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerName="download-cache-openstack-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376629 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerName="download-cache-openstack-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376685 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-content"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376757 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-content"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376987 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerName="download-cache-openstack-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.377063 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.377788 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.379948 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.380356 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.414811 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ggj5f"]
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491381 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491456 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491639 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593343 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593407 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593440 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName:
\"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593536 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.596549 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.596677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.597143 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.618406 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.709058 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:09:35 crc kubenswrapper[5094]: I0220 09:09:35.325835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ggj5f"] Feb 20 09:09:35 crc kubenswrapper[5094]: I0220 09:09:35.332771 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:09:36 crc kubenswrapper[5094]: I0220 09:09:36.346200 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerStarted","Data":"ac9dc0ee68deb94a4ff5312e0a939b0b139cabb294d3a2db6521599467eefd6e"} Feb 20 09:09:36 crc kubenswrapper[5094]: I0220 09:09:36.346768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerStarted","Data":"77532f6a8f46591197d08ef5699cbaa0638e65ff7cf2caf3ef1ddf35fd1e6aa0"} Feb 20 09:09:36 crc kubenswrapper[5094]: I0220 09:09:36.384025 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" podStartSLOduration=1.960952687 podStartE2EDuration="2.384005055s" podCreationTimestamp="2026-02-20 09:09:34 +0000 UTC" firstStartedPulling="2026-02-20 09:09:35.332499048 +0000 UTC 
m=+8590.205125759" lastFinishedPulling="2026-02-20 09:09:35.755551396 +0000 UTC m=+8590.628178127" observedRunningTime="2026-02-20 09:09:36.375110082 +0000 UTC m=+8591.247736793" watchObservedRunningTime="2026-02-20 09:09:36.384005055 +0000 UTC m=+8591.256631756" Feb 20 09:09:44 crc kubenswrapper[5094]: I0220 09:09:44.842068 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:09:44 crc kubenswrapper[5094]: E0220 09:09:44.842828 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:09:55 crc kubenswrapper[5094]: I0220 09:09:55.545600 5094 generic.go:334] "Generic (PLEG): container finished" podID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerID="5fe5bfe654a87fe11ab6e48fba7983d6c8f9cb8b7a0de1d97571c5ae01535c98" exitCode=0 Feb 20 09:09:55 crc kubenswrapper[5094]: I0220 09:09:55.546383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerDied","Data":"5fe5bfe654a87fe11ab6e48fba7983d6c8f9cb8b7a0de1d97571c5ae01535c98"} Feb 20 09:09:56 crc kubenswrapper[5094]: I0220 09:09:56.841762 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:09:56 crc kubenswrapper[5094]: E0220 09:09:56.843252 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.131525 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.321904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"de84413a-d424-4ec1-bb6d-e91b2278b854\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.322446 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"de84413a-d424-4ec1-bb6d-e91b2278b854\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.322516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"de84413a-d424-4ec1-bb6d-e91b2278b854\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.328409 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4" (OuterVolumeSpecName: "kube-api-access-mrbg4") pod "de84413a-d424-4ec1-bb6d-e91b2278b854" (UID: "de84413a-d424-4ec1-bb6d-e91b2278b854"). InnerVolumeSpecName "kube-api-access-mrbg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.358026 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory" (OuterVolumeSpecName: "inventory") pod "de84413a-d424-4ec1-bb6d-e91b2278b854" (UID: "de84413a-d424-4ec1-bb6d-e91b2278b854"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.361794 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "de84413a-d424-4ec1-bb6d-e91b2278b854" (UID: "de84413a-d424-4ec1-bb6d-e91b2278b854"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.425983 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.426047 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") on node \"crc\" DevicePath \"\"" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.426076 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.566067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" 
event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerDied","Data":"ebb40d52986a5e8028a0edd2009cbe5bef1de50e2634f808b231951cf13031cc"} Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.566107 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebb40d52986a5e8028a0edd2009cbe5bef1de50e2634f808b231951cf13031cc" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.566163 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.674466 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-networker-2pr7n"] Feb 20 09:09:57 crc kubenswrapper[5094]: E0220 09:09:57.675062 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerName="configure-network-openstack-openstack-networker" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.675088 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerName="configure-network-openstack-openstack-networker" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.675380 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerName="configure-network-openstack-openstack-networker" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.676326 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.680048 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.680107 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.689495 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-2pr7n"] Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.834752 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.834972 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.835197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " 
pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.938023 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.938132 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.938252 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.943835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.946054 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.956159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.993777 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:09:58 crc kubenswrapper[5094]: I0220 09:09:58.561900 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-2pr7n"] Feb 20 09:09:58 crc kubenswrapper[5094]: W0220 09:09:58.572271 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04426ab_50e7_4345_842b_69bfcc58207c.slice/crio-9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701 WatchSource:0}: Error finding container 9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701: Status 404 returned error can't find the container with id 9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701 Feb 20 09:09:59 crc kubenswrapper[5094]: I0220 09:09:59.592039 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" 
event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerStarted","Data":"8fa8add69c953aa8c865ae40971a37100de7a516ed62f628f7fee2af28f45d53"} Feb 20 09:09:59 crc kubenswrapper[5094]: I0220 09:09:59.592592 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerStarted","Data":"9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701"} Feb 20 09:09:59 crc kubenswrapper[5094]: I0220 09:09:59.620394 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" podStartSLOduration=2.022348811 podStartE2EDuration="2.620375358s" podCreationTimestamp="2026-02-20 09:09:57 +0000 UTC" firstStartedPulling="2026-02-20 09:09:58.574586698 +0000 UTC m=+8613.447213409" lastFinishedPulling="2026-02-20 09:09:59.172613245 +0000 UTC m=+8614.045239956" observedRunningTime="2026-02-20 09:09:59.613914812 +0000 UTC m=+8614.486541523" watchObservedRunningTime="2026-02-20 09:09:59.620375358 +0000 UTC m=+8614.493002059" Feb 20 09:10:04 crc kubenswrapper[5094]: I0220 09:10:04.652350 5094 generic.go:334] "Generic (PLEG): container finished" podID="f04426ab-50e7-4345-842b-69bfcc58207c" containerID="8fa8add69c953aa8c865ae40971a37100de7a516ed62f628f7fee2af28f45d53" exitCode=0 Feb 20 09:10:04 crc kubenswrapper[5094]: I0220 09:10:04.652393 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerDied","Data":"8fa8add69c953aa8c865ae40971a37100de7a516ed62f628f7fee2af28f45d53"} Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.171099 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.341678 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"f04426ab-50e7-4345-842b-69bfcc58207c\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.341774 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"f04426ab-50e7-4345-842b-69bfcc58207c\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.341972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"f04426ab-50e7-4345-842b-69bfcc58207c\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.348694 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k" (OuterVolumeSpecName: "kube-api-access-mcj8k") pod "f04426ab-50e7-4345-842b-69bfcc58207c" (UID: "f04426ab-50e7-4345-842b-69bfcc58207c"). InnerVolumeSpecName "kube-api-access-mcj8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.378217 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory" (OuterVolumeSpecName: "inventory") pod "f04426ab-50e7-4345-842b-69bfcc58207c" (UID: "f04426ab-50e7-4345-842b-69bfcc58207c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.388002 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "f04426ab-50e7-4345-842b-69bfcc58207c" (UID: "f04426ab-50e7-4345-842b-69bfcc58207c"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.444302 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.444331 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.444343 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.678400 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerDied","Data":"9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701"} Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.678442 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 
09:10:06.678599 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.761498 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-networker-tqhtr"] Feb 20 09:10:06 crc kubenswrapper[5094]: E0220 09:10:06.762055 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04426ab-50e7-4345-842b-69bfcc58207c" containerName="validate-network-openstack-openstack-networker" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.762084 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04426ab-50e7-4345-842b-69bfcc58207c" containerName="validate-network-openstack-openstack-networker" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.762319 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04426ab-50e7-4345-842b-69bfcc58207c" containerName="validate-network-openstack-openstack-networker" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.763131 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.766150 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.766436 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.775176 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-tqhtr"] Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.954794 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.955076 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.955178 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 
09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.056727 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.057230 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.058112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.063662 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.064201 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod 
\"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.073338 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.082181 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.640322 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-tqhtr"] Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.696046 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerStarted","Data":"baa0af510f31ddefaf054818af5e9e587e1051876dcf44574d86835d0dea5bbc"} Feb 20 09:10:08 crc kubenswrapper[5094]: I0220 09:10:08.710882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerStarted","Data":"53d37e4df512781460481b64b9d79b108f7fa1fdc52b31081e79c7d0aef5a644"} Feb 20 09:10:08 crc kubenswrapper[5094]: I0220 09:10:08.749440 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-networker-tqhtr" podStartSLOduration=2.269640922 podStartE2EDuration="2.749409865s" podCreationTimestamp="2026-02-20 09:10:06 +0000 UTC" firstStartedPulling="2026-02-20 
09:10:07.638257062 +0000 UTC m=+8622.510883783" lastFinishedPulling="2026-02-20 09:10:08.118025975 +0000 UTC m=+8622.990652726" observedRunningTime="2026-02-20 09:10:08.739992738 +0000 UTC m=+8623.612619489" watchObservedRunningTime="2026-02-20 09:10:08.749409865 +0000 UTC m=+8623.622036616" Feb 20 09:10:10 crc kubenswrapper[5094]: I0220 09:10:10.840306 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:10:10 crc kubenswrapper[5094]: E0220 09:10:10.840843 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:21 crc kubenswrapper[5094]: I0220 09:10:21.841271 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:10:21 crc kubenswrapper[5094]: E0220 09:10:21.842177 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:33 crc kubenswrapper[5094]: I0220 09:10:33.840293 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:10:33 crc kubenswrapper[5094]: E0220 09:10:33.841100 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:34 crc kubenswrapper[5094]: I0220 09:10:34.942317 5094 generic.go:334] "Generic (PLEG): container finished" podID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerID="ac9dc0ee68deb94a4ff5312e0a939b0b139cabb294d3a2db6521599467eefd6e" exitCode=0 Feb 20 09:10:34 crc kubenswrapper[5094]: I0220 09:10:34.942623 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerDied","Data":"ac9dc0ee68deb94a4ff5312e0a939b0b139cabb294d3a2db6521599467eefd6e"} Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.395087 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.503913 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.503964 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.504056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.504153 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.510035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph" (OuterVolumeSpecName: "ceph") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.510074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4" (OuterVolumeSpecName: "kube-api-access-tgvc4") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "kube-api-access-tgvc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.538898 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory" (OuterVolumeSpecName: "inventory") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.542976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606348 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606406 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606418 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606427 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.962019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerDied","Data":"77532f6a8f46591197d08ef5699cbaa0638e65ff7cf2caf3ef1ddf35fd1e6aa0"} Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.962060 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77532f6a8f46591197d08ef5699cbaa0638e65ff7cf2caf3ef1ddf35fd1e6aa0" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.962110 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.046611 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-k27rr"] Feb 20 09:10:37 crc kubenswrapper[5094]: E0220 09:10:37.047348 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerName="configure-network-openstack-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.047454 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerName="configure-network-openstack-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.047719 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerName="configure-network-openstack-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.048507 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.051844 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.052640 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.058593 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-k27rr"] Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.220185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.220293 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.221601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 
09:10:37.221667 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.325538 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.325859 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.325928 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.326142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod 
\"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.332284 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.335138 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.335529 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.346456 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.376224 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.900318 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-k27rr"] Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.970643 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerStarted","Data":"eef06862041ecff1ebe280bbf7209ac78f7908fce316f1b44f1b738e154e8e8c"} Feb 20 09:10:38 crc kubenswrapper[5094]: I0220 09:10:38.981341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerStarted","Data":"d5f2d5bed405ecbd984035db7e835305c214286b97a01996788d3dabeb541dbf"} Feb 20 09:10:39 crc kubenswrapper[5094]: I0220 09:10:39.002049 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" podStartSLOduration=1.479114225 podStartE2EDuration="2.002028055s" podCreationTimestamp="2026-02-20 09:10:37 +0000 UTC" firstStartedPulling="2026-02-20 09:10:37.905861923 +0000 UTC m=+8652.778488634" lastFinishedPulling="2026-02-20 09:10:38.428775753 +0000 UTC m=+8653.301402464" observedRunningTime="2026-02-20 09:10:38.995348844 +0000 UTC m=+8653.867975575" watchObservedRunningTime="2026-02-20 09:10:39.002028055 +0000 UTC m=+8653.874654766" Feb 20 09:10:44 crc kubenswrapper[5094]: I0220 09:10:44.840929 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:10:44 crc kubenswrapper[5094]: E0220 09:10:44.842524 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:46 crc kubenswrapper[5094]: I0220 09:10:46.045373 5094 generic.go:334] "Generic (PLEG): container finished" podID="61a917df-8faa-482f-9582-0c5737301057" containerID="d5f2d5bed405ecbd984035db7e835305c214286b97a01996788d3dabeb541dbf" exitCode=0 Feb 20 09:10:46 crc kubenswrapper[5094]: I0220 09:10:46.045471 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerDied","Data":"d5f2d5bed405ecbd984035db7e835305c214286b97a01996788d3dabeb541dbf"} Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.502643 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651089 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651181 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651217 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651247 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.656485 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz" (OuterVolumeSpecName: "kube-api-access-ktjhz") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "kube-api-access-ktjhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.656874 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph" (OuterVolumeSpecName: "ceph") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.678433 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory" (OuterVolumeSpecName: "inventory") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.686166 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754072 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754393 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754403 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754412 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.065943 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerDied","Data":"eef06862041ecff1ebe280bbf7209ac78f7908fce316f1b44f1b738e154e8e8c"} Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.066002 5094 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eef06862041ecff1ebe280bbf7209ac78f7908fce316f1b44f1b738e154e8e8c" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.066008 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.181012 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8j6nf"] Feb 20 09:10:48 crc kubenswrapper[5094]: E0220 09:10:48.181556 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a917df-8faa-482f-9582-0c5737301057" containerName="validate-network-openstack-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.181580 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a917df-8faa-482f-9582-0c5737301057" containerName="validate-network-openstack-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.181884 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a917df-8faa-482f-9582-0c5737301057" containerName="validate-network-openstack-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.182982 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.185499 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.192589 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.193797 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8j6nf"] Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269446 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269537 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269666 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.371738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.372112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.372164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.372205 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " 
pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.379324 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.390258 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.390625 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.392938 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.550543 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:49 crc kubenswrapper[5094]: I0220 09:10:49.239278 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8j6nf"] Feb 20 09:10:50 crc kubenswrapper[5094]: I0220 09:10:50.083554 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerStarted","Data":"2213442e38aa4ed71dfd4b1cb999e22f654d497f4fc2fb09a72fef9b37908a01"} Feb 20 09:10:51 crc kubenswrapper[5094]: I0220 09:10:51.093994 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerStarted","Data":"7453b73c41008eee230417fe9bbf595522244c350a0fb1c6a70aacaecfc7b2af"} Feb 20 09:10:51 crc kubenswrapper[5094]: I0220 09:10:51.116867 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" podStartSLOduration=2.22922089 podStartE2EDuration="3.116845884s" podCreationTimestamp="2026-02-20 09:10:48 +0000 UTC" firstStartedPulling="2026-02-20 09:10:49.253770543 +0000 UTC m=+8664.126397254" lastFinishedPulling="2026-02-20 09:10:50.141395527 +0000 UTC m=+8665.014022248" observedRunningTime="2026-02-20 09:10:51.112175442 +0000 UTC m=+8665.984802173" watchObservedRunningTime="2026-02-20 09:10:51.116845884 +0000 UTC m=+8665.989472605" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.573432 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.576382 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.623066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.623635 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.623682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.633779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.742992 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.743374 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.743578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.744501 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.744819 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.801671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.892928 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:55 crc kubenswrapper[5094]: I0220 09:10:55.374457 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:10:55 crc kubenswrapper[5094]: I0220 09:10:55.846904 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:10:55 crc kubenswrapper[5094]: E0220 09:10:55.848906 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:56 crc kubenswrapper[5094]: I0220 09:10:56.146531 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" exitCode=0 Feb 20 09:10:56 crc kubenswrapper[5094]: I0220 09:10:56.146629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8"} Feb 20 09:10:56 crc kubenswrapper[5094]: I0220 09:10:56.146966 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerStarted","Data":"59fc4a663a25aaeb18ed744c2850733f6000d6cb2136491a9705a24d5fd293ba"} Feb 20 09:10:58 crc kubenswrapper[5094]: I0220 09:10:58.174970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" 
event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerStarted","Data":"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd"} Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.195343 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" exitCode=0 Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.195424 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd"} Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.198190 5094 generic.go:334] "Generic (PLEG): container finished" podID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerID="53d37e4df512781460481b64b9d79b108f7fa1fdc52b31081e79c7d0aef5a644" exitCode=0 Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.198220 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerDied","Data":"53d37e4df512781460481b64b9d79b108f7fa1fdc52b31081e79c7d0aef5a644"} Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.209391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerStarted","Data":"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8"} Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.733712 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.759918 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8vl2" podStartSLOduration=3.078224994 podStartE2EDuration="7.759865775s" podCreationTimestamp="2026-02-20 09:10:54 +0000 UTC" firstStartedPulling="2026-02-20 09:10:56.148472046 +0000 UTC m=+8671.021098757" lastFinishedPulling="2026-02-20 09:11:00.830112827 +0000 UTC m=+8675.702739538" observedRunningTime="2026-02-20 09:11:01.233766518 +0000 UTC m=+8676.106393229" watchObservedRunningTime="2026-02-20 09:11:01.759865775 +0000 UTC m=+8676.632492506" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.790716 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.790935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.791047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.816037 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz" (OuterVolumeSpecName: "kube-api-access-r4cjz") pod "e890bf4c-20ec-4b45-936d-d08d3a73b5ee" (UID: "e890bf4c-20ec-4b45-936d-d08d3a73b5ee"). InnerVolumeSpecName "kube-api-access-r4cjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.819493 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory" (OuterVolumeSpecName: "inventory") pod "e890bf4c-20ec-4b45-936d-d08d3a73b5ee" (UID: "e890bf4c-20ec-4b45-936d-d08d3a73b5ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.832284 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e890bf4c-20ec-4b45-936d-d08d3a73b5ee" (UID: "e890bf4c-20ec-4b45-936d-d08d3a73b5ee"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.893209 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.893244 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.893257 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.220371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerDied","Data":"baa0af510f31ddefaf054818af5e9e587e1051876dcf44574d86835d0dea5bbc"} Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.220418 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa0af510f31ddefaf054818af5e9e587e1051876dcf44574d86835d0dea5bbc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.220512 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.323200 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-networker-ts6jc"] Feb 20 09:11:02 crc kubenswrapper[5094]: E0220 09:11:02.323788 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerName="install-os-openstack-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.323826 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerName="install-os-openstack-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.324017 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerName="install-os-openstack-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.324785 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.330480 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.330724 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.335073 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-ts6jc"] Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.404479 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.405008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.405210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " 
pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.506164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.506292 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.506369 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.515605 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.515681 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.522985 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.649813 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:03 crc kubenswrapper[5094]: I0220 09:11:03.243438 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-ts6jc"] Feb 20 09:11:03 crc kubenswrapper[5094]: W0220 09:11:03.259815 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9278a86a_be7e_4e04_a187_52d0c119ccb5.slice/crio-53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a WatchSource:0}: Error finding container 53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a: Status 404 returned error can't find the container with id 53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.263885 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerStarted","Data":"0ecff3a7f81df043e7dc17dcab78f646edff610b7b2c8f1e0759f2364b70587c"} Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 
09:11:04.264401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerStarted","Data":"53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a"} Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.295813 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" podStartSLOduration=1.802982568 podStartE2EDuration="2.295793275s" podCreationTimestamp="2026-02-20 09:11:02 +0000 UTC" firstStartedPulling="2026-02-20 09:11:03.262617228 +0000 UTC m=+8678.135243939" lastFinishedPulling="2026-02-20 09:11:03.755427935 +0000 UTC m=+8678.628054646" observedRunningTime="2026-02-20 09:11:04.282115656 +0000 UTC m=+8679.154742357" watchObservedRunningTime="2026-02-20 09:11:04.295793275 +0000 UTC m=+8679.168419986" Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.893088 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.893386 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:05 crc kubenswrapper[5094]: I0220 09:11:05.941882 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8vl2" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" probeResult="failure" output=< Feb 20 09:11:05 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:11:05 crc kubenswrapper[5094]: > Feb 20 09:11:10 crc kubenswrapper[5094]: I0220 09:11:10.840557 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:11:10 crc kubenswrapper[5094]: E0220 09:11:10.841741 5094 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:11:14 crc kubenswrapper[5094]: I0220 09:11:14.951133 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:15 crc kubenswrapper[5094]: I0220 09:11:15.015838 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:15 crc kubenswrapper[5094]: I0220 09:11:15.195975 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:11:16 crc kubenswrapper[5094]: I0220 09:11:16.368293 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s8vl2" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" containerID="cri-o://0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" gracePeriod=2 Feb 20 09:11:16 crc kubenswrapper[5094]: I0220 09:11:16.946185 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.114851 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"d7bd9285-f186-43d4-a61f-181c864d71f6\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.115337 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"d7bd9285-f186-43d4-a61f-181c864d71f6\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.115589 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"d7bd9285-f186-43d4-a61f-181c864d71f6\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.117033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities" (OuterVolumeSpecName: "utilities") pod "d7bd9285-f186-43d4-a61f-181c864d71f6" (UID: "d7bd9285-f186-43d4-a61f-181c864d71f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.125661 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr" (OuterVolumeSpecName: "kube-api-access-bzbxr") pod "d7bd9285-f186-43d4-a61f-181c864d71f6" (UID: "d7bd9285-f186-43d4-a61f-181c864d71f6"). InnerVolumeSpecName "kube-api-access-bzbxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.218606 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.218653 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.266439 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7bd9285-f186-43d4-a61f-181c864d71f6" (UID: "d7bd9285-f186-43d4-a61f-181c864d71f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.321338 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.381803 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" exitCode=0 Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.381857 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.381879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8"} Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.382834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"59fc4a663a25aaeb18ed744c2850733f6000d6cb2136491a9705a24d5fd293ba"} Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.382875 5094 scope.go:117] "RemoveContainer" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.422416 5094 scope.go:117] "RemoveContainer" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.447182 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.464289 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.468394 5094 scope.go:117] "RemoveContainer" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.511870 5094 scope.go:117] "RemoveContainer" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" Feb 20 09:11:17 crc kubenswrapper[5094]: E0220 09:11:17.512821 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8\": container with ID starting with 0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8 not found: ID does not exist" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.512884 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8"} err="failed to get container status \"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8\": rpc error: code = NotFound desc = could not find container \"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8\": container with ID starting with 0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8 not found: ID does not exist" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.512917 5094 scope.go:117] "RemoveContainer" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" Feb 20 09:11:17 crc kubenswrapper[5094]: E0220 09:11:17.513380 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd\": container with ID starting with c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd not found: ID does not exist" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.513483 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd"} err="failed to get container status \"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd\": rpc error: code = NotFound desc = could not find container \"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd\": container with ID 
starting with c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd not found: ID does not exist" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.513636 5094 scope.go:117] "RemoveContainer" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" Feb 20 09:11:17 crc kubenswrapper[5094]: E0220 09:11:17.514286 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8\": container with ID starting with a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8 not found: ID does not exist" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.514315 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8"} err="failed to get container status \"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8\": rpc error: code = NotFound desc = could not find container \"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8\": container with ID starting with a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8 not found: ID does not exist" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.855951 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" path="/var/lib/kubelet/pods/d7bd9285-f186-43d4-a61f-181c864d71f6/volumes" Feb 20 09:11:24 crc kubenswrapper[5094]: I0220 09:11:24.841583 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:11:24 crc kubenswrapper[5094]: E0220 09:11:24.842321 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:11:37 crc kubenswrapper[5094]: I0220 09:11:37.840927 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:11:38 crc kubenswrapper[5094]: I0220 09:11:38.604762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf"} Feb 20 09:11:39 crc kubenswrapper[5094]: I0220 09:11:39.615567 5094 generic.go:334] "Generic (PLEG): container finished" podID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerID="7453b73c41008eee230417fe9bbf595522244c350a0fb1c6a70aacaecfc7b2af" exitCode=0 Feb 20 09:11:39 crc kubenswrapper[5094]: I0220 09:11:39.615600 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerDied","Data":"7453b73c41008eee230417fe9bbf595522244c350a0fb1c6a70aacaecfc7b2af"} Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.124738 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271611 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271690 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271888 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.278894 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph" (OuterVolumeSpecName: "ceph") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.279000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn" (OuterVolumeSpecName: "kube-api-access-ktxwn") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "kube-api-access-ktxwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.306891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.322605 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory" (OuterVolumeSpecName: "inventory") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375087 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375125 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375140 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375151 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.635579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerDied","Data":"2213442e38aa4ed71dfd4b1cb999e22f654d497f4fc2fb09a72fef9b37908a01"} Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.635608 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.635617 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2213442e38aa4ed71dfd4b1cb999e22f654d497f4fc2fb09a72fef9b37908a01" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722152 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-x6wr7"] Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722578 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-content" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722592 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-content" Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722623 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-utilities" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722630 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-utilities" Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722637 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerName="install-os-openstack-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerName="install-os-openstack-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722656 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722661 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722922 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerName="install-os-openstack-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722951 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.733397 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-x6wr7"] Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.733504 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.737445 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.741550 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.883826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.883894 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjq54\" (UniqueName: 
\"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.884068 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.884163 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.986285 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.986365 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc 
kubenswrapper[5094]: I0220 09:11:41.986430 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.986470 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.990541 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.990753 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.990836 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " 
pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.006822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.050001 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.579919 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-x6wr7"] Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.645752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerStarted","Data":"2d2225b86ca01f28b60c92f79f63009b9f37ea550a1f62fbf59736694ec08b08"} Feb 20 09:11:43 crc kubenswrapper[5094]: I0220 09:11:43.654511 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerStarted","Data":"e10f72773552a2cc40555e02e367ba4fdd371e010afefe0f9daa523b1f3f9ffa"} Feb 20 09:11:43 crc kubenswrapper[5094]: I0220 09:11:43.672403 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" podStartSLOduration=2.151355225 podStartE2EDuration="2.672388601s" podCreationTimestamp="2026-02-20 09:11:41 +0000 UTC" firstStartedPulling="2026-02-20 09:11:42.581993238 +0000 UTC m=+8717.454619949" lastFinishedPulling="2026-02-20 09:11:43.103026624 +0000 UTC 
m=+8717.975653325" observedRunningTime="2026-02-20 09:11:43.669946013 +0000 UTC m=+8718.542572724" watchObservedRunningTime="2026-02-20 09:11:43.672388601 +0000 UTC m=+8718.545015312" Feb 20 09:11:53 crc kubenswrapper[5094]: I0220 09:11:53.759898 5094 generic.go:334] "Generic (PLEG): container finished" podID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerID="0ecff3a7f81df043e7dc17dcab78f646edff610b7b2c8f1e0759f2364b70587c" exitCode=0 Feb 20 09:11:53 crc kubenswrapper[5094]: I0220 09:11:53.760483 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerDied","Data":"0ecff3a7f81df043e7dc17dcab78f646edff610b7b2c8f1e0759f2364b70587c"} Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.254190 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.371839 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"9278a86a-be7e-4e04-a187-52d0c119ccb5\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.372184 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"9278a86a-be7e-4e04-a187-52d0c119ccb5\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.372278 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod 
\"9278a86a-be7e-4e04-a187-52d0c119ccb5\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.385761 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb" (OuterVolumeSpecName: "kube-api-access-m2dnb") pod "9278a86a-be7e-4e04-a187-52d0c119ccb5" (UID: "9278a86a-be7e-4e04-a187-52d0c119ccb5"). InnerVolumeSpecName "kube-api-access-m2dnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.399912 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "9278a86a-be7e-4e04-a187-52d0c119ccb5" (UID: "9278a86a-be7e-4e04-a187-52d0c119ccb5"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.400862 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory" (OuterVolumeSpecName: "inventory") pod "9278a86a-be7e-4e04-a187-52d0c119ccb5" (UID: "9278a86a-be7e-4e04-a187-52d0c119ccb5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.475124 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.475164 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.475180 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.781188 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerDied","Data":"53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a"} Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.781495 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.781785 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.868381 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-networker-f78nb"] Feb 20 09:11:55 crc kubenswrapper[5094]: E0220 09:11:55.868968 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerName="configure-os-openstack-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.868994 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerName="configure-os-openstack-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.869191 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerName="configure-os-openstack-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.869981 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.878964 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.879158 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.933319 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-f78nb"] Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.983942 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.984028 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.984469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: 
I0220 09:11:56.086529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.087788 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.088547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.093041 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.093079 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: 
\"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.105159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.238521 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.838166 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-f78nb"] Feb 20 09:11:56 crc kubenswrapper[5094]: W0220 09:11:56.842528 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95f054cd_db3e_45e0_9e12_55c2da3b5a23.slice/crio-733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8 WatchSource:0}: Error finding container 733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8: Status 404 returned error can't find the container with id 733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8 Feb 20 09:11:57 crc kubenswrapper[5094]: I0220 09:11:57.803696 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerStarted","Data":"4d50aac56336d9f1985d507ee643e6f65ac7303c55439b06997bd5c0b8b104d5"} Feb 20 09:11:57 crc kubenswrapper[5094]: I0220 09:11:57.804087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" 
event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerStarted","Data":"733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8"} Feb 20 09:11:57 crc kubenswrapper[5094]: I0220 09:11:57.825844 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-networker-f78nb" podStartSLOduration=2.226813976 podStartE2EDuration="2.825823966s" podCreationTimestamp="2026-02-20 09:11:55 +0000 UTC" firstStartedPulling="2026-02-20 09:11:56.846906775 +0000 UTC m=+8731.719533486" lastFinishedPulling="2026-02-20 09:11:57.445916755 +0000 UTC m=+8732.318543476" observedRunningTime="2026-02-20 09:11:57.820840565 +0000 UTC m=+8732.693467276" watchObservedRunningTime="2026-02-20 09:11:57.825823966 +0000 UTC m=+8732.698450677" Feb 20 09:12:05 crc kubenswrapper[5094]: I0220 09:12:05.881641 5094 generic.go:334] "Generic (PLEG): container finished" podID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerID="4d50aac56336d9f1985d507ee643e6f65ac7303c55439b06997bd5c0b8b104d5" exitCode=0 Feb 20 09:12:05 crc kubenswrapper[5094]: I0220 09:12:05.881779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerDied","Data":"4d50aac56336d9f1985d507ee643e6f65ac7303c55439b06997bd5c0b8b104d5"} Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.384742 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.411318 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") "
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.411389 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") "
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.411440 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") "
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.417251 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js" (OuterVolumeSpecName: "kube-api-access-5t9js") pod "95f054cd-db3e-45e0-9e12-55c2da3b5a23" (UID: "95f054cd-db3e-45e0-9e12-55c2da3b5a23"). InnerVolumeSpecName "kube-api-access-5t9js". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.444000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory" (OuterVolumeSpecName: "inventory") pod "95f054cd-db3e-45e0-9e12-55c2da3b5a23" (UID: "95f054cd-db3e-45e0-9e12-55c2da3b5a23"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.445103 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "95f054cd-db3e-45e0-9e12-55c2da3b5a23" (UID: "95f054cd-db3e-45e0-9e12-55c2da3b5a23"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.520231 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.520278 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.520288 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.901506 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerDied","Data":"733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8"}
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.901549 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.901564 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.968442 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-x57s5"]
Feb 20 09:12:07 crc kubenswrapper[5094]: E0220 09:12:07.969101 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerName="run-os-openstack-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.969127 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerName="run-os-openstack-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.969360 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerName="run-os-openstack-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.970312 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.972850 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.976874 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.977825 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-x57s5"]
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.037109 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.037185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.037224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.139407 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.139489 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.139524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.143445 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.146287 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.154802 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.302654 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.848794 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-x57s5"]
Feb 20 09:12:08 crc kubenswrapper[5094]: W0220 09:12:08.850971 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e673130_22d3_4300_a143_c2821deb8cac.slice/crio-c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3 WatchSource:0}: Error finding container c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3: Status 404 returned error can't find the container with id c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.912741 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerStarted","Data":"c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3"}
Feb 20 09:12:09 crc kubenswrapper[5094]: I0220 09:12:09.925343 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerStarted","Data":"f4999008d71561b3987b5f2c05b58a5e54c18f25547d77eaa61d1ea04c4081ea"}
Feb 20 09:12:09 crc kubenswrapper[5094]: I0220 09:12:09.947054 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" podStartSLOduration=2.172587316 podStartE2EDuration="2.947034689s" podCreationTimestamp="2026-02-20 09:12:07 +0000 UTC" firstStartedPulling="2026-02-20 09:12:08.852740932 +0000 UTC m=+8743.725367643" lastFinishedPulling="2026-02-20 09:12:09.627188275 +0000 UTC m=+8744.499815016" observedRunningTime="2026-02-20 09:12:09.939557449 +0000 UTC m=+8744.812184170" watchObservedRunningTime="2026-02-20 09:12:09.947034689 +0000 UTC m=+8744.819661400"
Feb 20 09:12:24 crc kubenswrapper[5094]: I0220 09:12:24.072885 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e673130-22d3-4300-a143-c2821deb8cac" containerID="f4999008d71561b3987b5f2c05b58a5e54c18f25547d77eaa61d1ea04c4081ea" exitCode=0
Feb 20 09:12:24 crc kubenswrapper[5094]: I0220 09:12:24.072989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerDied","Data":"f4999008d71561b3987b5f2c05b58a5e54c18f25547d77eaa61d1ea04c4081ea"}
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.604731 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.725723 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"5e673130-22d3-4300-a143-c2821deb8cac\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") "
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.726057 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"5e673130-22d3-4300-a143-c2821deb8cac\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") "
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.726092 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"5e673130-22d3-4300-a143-c2821deb8cac\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") "
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.734968 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj" (OuterVolumeSpecName: "kube-api-access-d4pcj") pod "5e673130-22d3-4300-a143-c2821deb8cac" (UID: "5e673130-22d3-4300-a143-c2821deb8cac"). InnerVolumeSpecName "kube-api-access-d4pcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.753109 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "5e673130-22d3-4300-a143-c2821deb8cac" (UID: "5e673130-22d3-4300-a143-c2821deb8cac"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.761207 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory" (OuterVolumeSpecName: "inventory") pod "5e673130-22d3-4300-a143-c2821deb8cac" (UID: "5e673130-22d3-4300-a143-c2821deb8cac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.829287 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.829342 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.829356 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.093608 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerDied","Data":"c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3"}
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.093651 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.093652 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.200240 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-networker-xdx54"]
Feb 20 09:12:26 crc kubenswrapper[5094]: E0220 09:12:26.201051 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e673130-22d3-4300-a143-c2821deb8cac" containerName="reboot-os-openstack-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.201073 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e673130-22d3-4300-a143-c2821deb8cac" containerName="reboot-os-openstack-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.201333 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e673130-22d3-4300-a143-c2821deb8cac" containerName="reboot-os-openstack-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.202165 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.209312 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-xdx54"]
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.237419 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.237650 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249572 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249677 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249873 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249897 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351600 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351669 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351690 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351778 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351828 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351856 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.358440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.358807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.360879 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.361120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.366384 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.371872 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.554277 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:27 crc kubenswrapper[5094]: I0220 09:12:27.098237 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-xdx54"]
Feb 20 09:12:28 crc kubenswrapper[5094]: I0220 09:12:28.114878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerStarted","Data":"6c17819c117daffaa74cb33a9b4a26dc4bfdc09ad0bd109886743a58f1bd45bb"}
Feb 20 09:12:28 crc kubenswrapper[5094]: I0220 09:12:28.115163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerStarted","Data":"c2f19ab64377ee28e9a91ca87c0aaae5c486f7814233b439c782195562b6cbca"}
Feb 20 09:12:28 crc kubenswrapper[5094]: I0220 09:12:28.147183 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-networker-xdx54" podStartSLOduration=1.7502919129999999 podStartE2EDuration="2.14715875s" podCreationTimestamp="2026-02-20 09:12:26 +0000 UTC" firstStartedPulling="2026-02-20 09:12:27.1042726 +0000 UTC m=+8761.976899311" lastFinishedPulling="2026-02-20 09:12:27.501139437 +0000 UTC m=+8762.373766148" observedRunningTime="2026-02-20 09:12:28.134181957 +0000 UTC m=+8763.006808678" watchObservedRunningTime="2026-02-20 09:12:28.14715875 +0000 UTC m=+8763.019785461"
Feb 20 09:12:32 crc kubenswrapper[5094]: I0220 09:12:32.161566 5094 generic.go:334] "Generic (PLEG): container finished" podID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerID="e10f72773552a2cc40555e02e367ba4fdd371e010afefe0f9daa523b1f3f9ffa" exitCode=0
Feb 20 09:12:32 crc kubenswrapper[5094]: I0220 09:12:32.161649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerDied","Data":"e10f72773552a2cc40555e02e367ba4fdd371e010afefe0f9daa523b1f3f9ffa"}
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.685411 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7"
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814438 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814573 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814844 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.827988 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph" (OuterVolumeSpecName: "ceph") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.828065 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54" (OuterVolumeSpecName: "kube-api-access-qjq54") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "kube-api-access-qjq54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.842097 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.842889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory" (OuterVolumeSpecName: "inventory") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916745 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916788 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916800 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916809 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.181727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerDied","Data":"2d2225b86ca01f28b60c92f79f63009b9f37ea550a1f62fbf59736694ec08b08"}
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.181765 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d2225b86ca01f28b60c92f79f63009b9f37ea550a1f62fbf59736694ec08b08"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.181828 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.317560 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-zb287"]
Feb 20 09:12:34 crc kubenswrapper[5094]: E0220 09:12:34.318094 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerName="configure-os-openstack-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.318118 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerName="configure-os-openstack-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.318381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerName="configure-os-openstack-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.332501 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-zb287"]
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.332608 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.337294 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.337523 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426859 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426905 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426946 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426996 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.427017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.427041 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528771 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528822 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528861 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528918 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528944 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528969 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.535427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.536170 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod 
\"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.540317 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.540804 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.552120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.552479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.682693 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:35 crc kubenswrapper[5094]: I0220 09:12:35.182420 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-zb287"] Feb 20 09:12:36 crc kubenswrapper[5094]: I0220 09:12:36.202973 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerStarted","Data":"b79daef797bf727b856e9ca751d28441e3426324d5faa4cbdf09b3eda8c461b5"} Feb 20 09:12:36 crc kubenswrapper[5094]: I0220 09:12:36.203793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerStarted","Data":"2a9b407833d6a7dbd7bb656dedee626bc225b77bd13ecbd8d16ede5733462de6"} Feb 20 09:12:36 crc kubenswrapper[5094]: I0220 09:12:36.231534 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-zb287" podStartSLOduration=1.754032516 podStartE2EDuration="2.231513683s" podCreationTimestamp="2026-02-20 09:12:34 +0000 UTC" firstStartedPulling="2026-02-20 09:12:35.18519555 +0000 UTC m=+8770.057822261" lastFinishedPulling="2026-02-20 09:12:35.662676717 +0000 UTC m=+8770.535303428" observedRunningTime="2026-02-20 09:12:36.225432036 +0000 UTC m=+8771.098058757" watchObservedRunningTime="2026-02-20 09:12:36.231513683 +0000 UTC m=+8771.104140394" Feb 20 09:12:38 crc kubenswrapper[5094]: I0220 09:12:38.224556 5094 generic.go:334] "Generic (PLEG): container finished" podID="21d27f85-64a1-4dc5-af39-89275cce2427" containerID="6c17819c117daffaa74cb33a9b4a26dc4bfdc09ad0bd109886743a58f1bd45bb" exitCode=0 Feb 20 09:12:38 crc kubenswrapper[5094]: I0220 09:12:38.224623 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" 
event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerDied","Data":"6c17819c117daffaa74cb33a9b4a26dc4bfdc09ad0bd109886743a58f1bd45bb"} Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.683956 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843517 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843566 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843612 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843775 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.849778 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.850279 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k" (OuterVolumeSpecName: "kube-api-access-bxj2k") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "kube-api-access-bxj2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.850799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.851195 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.876145 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.894912 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory" (OuterVolumeSpecName: "inventory") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948381 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948435 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948453 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948473 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948595 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948621 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.245186 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" 
event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerDied","Data":"c2f19ab64377ee28e9a91ca87c0aaae5c486f7814233b439c782195562b6cbca"} Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.245497 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f19ab64377ee28e9a91ca87c0aaae5c486f7814233b439c782195562b6cbca" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.245250 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.329961 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-networker-r6vzk"] Feb 20 09:12:40 crc kubenswrapper[5094]: E0220 09:12:40.330440 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d27f85-64a1-4dc5-af39-89275cce2427" containerName="install-certs-openstack-openstack-networker" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.330457 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d27f85-64a1-4dc5-af39-89275cce2427" containerName="install-certs-openstack-openstack-networker" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.330684 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d27f85-64a1-4dc5-af39-89275cce2427" containerName="install-certs-openstack-openstack-networker" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.331385 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.333591 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.333718 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.342942 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-r6vzk"] Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.458905 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.458954 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.458979 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.459328 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.459593 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.561906 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562233 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562331 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-r6vzk\" 
(UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562431 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562587 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.564374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.566020 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.572288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.573224 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.589484 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.649888 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:41 crc kubenswrapper[5094]: W0220 09:12:41.211782 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ef4f2ef_92a7_4d12_94a9_e3ee55412547.slice/crio-e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b WatchSource:0}: Error finding container e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b: Status 404 returned error can't find the container with id e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b Feb 20 09:12:41 crc kubenswrapper[5094]: I0220 09:12:41.214746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-r6vzk"] Feb 20 09:12:41 crc kubenswrapper[5094]: I0220 09:12:41.258294 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-r6vzk" event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerStarted","Data":"e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b"} Feb 20 09:12:42 crc kubenswrapper[5094]: I0220 09:12:42.268505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-r6vzk" event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerStarted","Data":"a7bba99fdce7206f2a33a9b6752071dfb931fd478046625f8eac18d1bd63f378"} Feb 20 09:12:42 crc kubenswrapper[5094]: I0220 09:12:42.287718 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-networker-r6vzk" podStartSLOduration=1.807364718 podStartE2EDuration="2.287688833s" podCreationTimestamp="2026-02-20 09:12:40 +0000 UTC" firstStartedPulling="2026-02-20 09:12:41.213780377 +0000 UTC m=+8776.086407098" lastFinishedPulling="2026-02-20 09:12:41.694104492 +0000 UTC m=+8776.566731213" observedRunningTime="2026-02-20 09:12:42.281841342 +0000 UTC 
m=+8777.154468053" watchObservedRunningTime="2026-02-20 09:12:42.287688833 +0000 UTC m=+8777.160315544" Feb 20 09:12:50 crc kubenswrapper[5094]: I0220 09:12:50.355674 5094 generic.go:334] "Generic (PLEG): container finished" podID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerID="b79daef797bf727b856e9ca751d28441e3426324d5faa4cbdf09b3eda8c461b5" exitCode=0 Feb 20 09:12:50 crc kubenswrapper[5094]: I0220 09:12:50.356001 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerDied","Data":"b79daef797bf727b856e9ca751d28441e3426324d5faa4cbdf09b3eda8c461b5"} Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.797891 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912569 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912829 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912889 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912916 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912980 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.919904 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8" (OuterVolumeSpecName: "kube-api-access-cznz8") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "kube-api-access-cznz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.920213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph" (OuterVolumeSpecName: "ceph") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.946429 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.947724 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.948333 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.958362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015886 5094 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015920 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015932 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015945 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015956 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015968 5094 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.377014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerDied","Data":"2a9b407833d6a7dbd7bb656dedee626bc225b77bd13ecbd8d16ede5733462de6"}
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.377058 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9b407833d6a7dbd7bb656dedee626bc225b77bd13ecbd8d16ede5733462de6"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.377089 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.467898 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fwcg9"]
Feb 20 09:12:52 crc kubenswrapper[5094]: E0220 09:12:52.468358 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerName="ssh-known-hosts-openstack"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.468376 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerName="ssh-known-hosts-openstack"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.468563 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerName="ssh-known-hosts-openstack"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.469320 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.471472 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.471700 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.480134 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fwcg9"]
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.628658 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.629008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.629093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.629146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730897 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.736465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.739167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.739804 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.757315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.822293 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:12:53 crc kubenswrapper[5094]: I0220 09:12:53.361426 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fwcg9"]
Feb 20 09:12:53 crc kubenswrapper[5094]: I0220 09:12:53.389766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerStarted","Data":"feb9c238dfa2d325536cf0d59da45a3a354dfd03f841db94b04d9035676d0d5c"}
Feb 20 09:12:54 crc kubenswrapper[5094]: I0220 09:12:54.400119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerStarted","Data":"4c5cfabbfd79a25aaeb041af0c03152e4aeb36ff29c53e0b94ec44ab7b88d4b4"}
Feb 20 09:12:54 crc kubenswrapper[5094]: I0220 09:12:54.426672 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" podStartSLOduration=1.9479881190000001 podStartE2EDuration="2.426649294s" podCreationTimestamp="2026-02-20 09:12:52 +0000 UTC" firstStartedPulling="2026-02-20 09:12:53.367265407 +0000 UTC m=+8788.239892118" lastFinishedPulling="2026-02-20 09:12:53.845926582 +0000 UTC m=+8788.718553293" observedRunningTime="2026-02-20 09:12:54.414551553 +0000 UTC m=+8789.287178264" watchObservedRunningTime="2026-02-20 09:12:54.426649294 +0000 UTC m=+8789.299276005"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.804148 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-swkr9"]
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.808588 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.821249 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"]
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.891768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.891862 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.891931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.993471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.993772 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.993942 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.994305 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.994781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:59 crc kubenswrapper[5094]: I0220 09:12:59.034204 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:59 crc kubenswrapper[5094]: I0220 09:12:59.129280 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:12:59 crc kubenswrapper[5094]: I0220 09:12:59.625601 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"]
Feb 20 09:13:00 crc kubenswrapper[5094]: I0220 09:13:00.460908 5094 generic.go:334] "Generic (PLEG): container finished" podID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" exitCode=0
Feb 20 09:13:00 crc kubenswrapper[5094]: I0220 09:13:00.460990 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0"}
Feb 20 09:13:00 crc kubenswrapper[5094]: I0220 09:13:00.461616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerStarted","Data":"c914d995a63fd7594c0e4bdb37a076bf16b8dc6dce300cea3fef7134aad270b1"}
Feb 20 09:13:01 crc kubenswrapper[5094]: I0220 09:13:01.474557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerStarted","Data":"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d"}
Feb 20 09:13:03 crc kubenswrapper[5094]: I0220 09:13:03.492528 5094 generic.go:334] "Generic (PLEG): container finished" podID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" exitCode=0
Feb 20 09:13:03 crc kubenswrapper[5094]: I0220 09:13:03.492601 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d"}
Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.520453 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerStarted","Data":"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa"}
Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.524246 5094 generic.go:334] "Generic (PLEG): container finished" podID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerID="4c5cfabbfd79a25aaeb041af0c03152e4aeb36ff29c53e0b94ec44ab7b88d4b4" exitCode=0
Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.524283 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerDied","Data":"4c5cfabbfd79a25aaeb041af0c03152e4aeb36ff29c53e0b94ec44ab7b88d4b4"}
Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.545853 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-swkr9" podStartSLOduration=3.146602253 podStartE2EDuration="6.545835542s" podCreationTimestamp="2026-02-20 09:12:58 +0000 UTC" firstStartedPulling="2026-02-20 09:13:00.464657817 +0000 UTC m=+8795.337284528" lastFinishedPulling="2026-02-20 09:13:03.863891096 +0000 UTC m=+8798.736517817" observedRunningTime="2026-02-20 09:13:04.541009997 +0000 UTC m=+8799.413636698" watchObservedRunningTime="2026-02-20 09:13:04.545835542 +0000 UTC m=+8799.418462253"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.064393 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151607 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") "
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151671 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") "
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151733 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") "
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151877 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") "
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.158830 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph" (OuterVolumeSpecName: "ceph") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.159462 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds" (OuterVolumeSpecName: "kube-api-access-hjwds") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "kube-api-access-hjwds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.180388 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.182621 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory" (OuterVolumeSpecName: "inventory") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.254921 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.254967 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.254986 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.255004 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.544769 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerDied","Data":"feb9c238dfa2d325536cf0d59da45a3a354dfd03f841db94b04d9035676d0d5c"}
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.544832 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feb9c238dfa2d325536cf0d59da45a3a354dfd03f841db94b04d9035676d0d5c"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.544912 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.678545 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-gpz7d"]
Feb 20 09:13:06 crc kubenswrapper[5094]: E0220 09:13:06.679006 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerName="run-os-openstack-openstack-cell1"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.679025 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerName="run-os-openstack-openstack-cell1"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.679227 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerName="run-os-openstack-openstack-cell1"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.679936 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.683078 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.683247 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.694161 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-gpz7d"]
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.763863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.763944 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.763996 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.764385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866672 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866975 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.870501 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.870782 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.871016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.885193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:07 crc kubenswrapper[5094]: I0220 09:13:06.999901 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d"
Feb 20 09:13:07 crc kubenswrapper[5094]: I0220 09:13:07.569690 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-gpz7d"]
Feb 20 09:13:08 crc kubenswrapper[5094]: I0220 09:13:08.564967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerStarted","Data":"535d1f9c00da485e84fcf5b15dfd55c8a8dda766b04ae2351a6d6b1fe7889e10"}
Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.130746 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.130840 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.195017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.576967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerStarted","Data":"ed8526b86ab62d7bd0921b55737c1c6fd30ec623adb1a48a2a35b5434303c21d"}
Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.610893 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" podStartSLOduration=2.816515888 podStartE2EDuration="3.610872218s" podCreationTimestamp="2026-02-20 09:13:06 +0000 UTC" firstStartedPulling="2026-02-20 09:13:07.575231234 +0000 UTC m=+8802.447878836" lastFinishedPulling="2026-02-20 09:13:08.369608455 +0000 UTC m=+8803.242235166" observedRunningTime="2026-02-20 09:13:09.594018313 +0000 UTC m=+8804.466645034" watchObservedRunningTime="2026-02-20 09:13:09.610872218 +0000 UTC m=+8804.483498939"
Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.639959 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.704287 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"]
Feb 20 09:13:11 crc kubenswrapper[5094]: I0220 09:13:11.594795 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-swkr9" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" containerID="cri-o://9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" gracePeriod=2
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.081327 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9"
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.193887 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"18ea844c-6c90-4428-a5af-073c2b5500b6\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") "
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.194024 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"18ea844c-6c90-4428-a5af-073c2b5500b6\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") "
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.194226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"18ea844c-6c90-4428-a5af-073c2b5500b6\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") "
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.194754 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities" (OuterVolumeSpecName: "utilities") pod "18ea844c-6c90-4428-a5af-073c2b5500b6" (UID: "18ea844c-6c90-4428-a5af-073c2b5500b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.204384 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9" (OuterVolumeSpecName: "kube-api-access-vd4d9") pod "18ea844c-6c90-4428-a5af-073c2b5500b6" (UID: "18ea844c-6c90-4428-a5af-073c2b5500b6"). InnerVolumeSpecName "kube-api-access-vd4d9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.252864 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18ea844c-6c90-4428-a5af-073c2b5500b6" (UID: "18ea844c-6c90-4428-a5af-073c2b5500b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.297008 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.297040 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.297052 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.605793 5094 generic.go:334] "Generic (PLEG): container finished" podID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" exitCode=0
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.606802 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa"}
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.606861 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"c914d995a63fd7594c0e4bdb37a076bf16b8dc6dce300cea3fef7134aad270b1"}
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.606881 5094 scope.go:117] "RemoveContainer" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa"
Feb 20 09:13:12 crc kubenswrapper[5094]: I0220
09:13:12.607047 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.633919 5094 scope.go:117] "RemoveContainer" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.651448 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.663561 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.676902 5094 scope.go:117] "RemoveContainer" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.721945 5094 scope.go:117] "RemoveContainer" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" Feb 20 09:13:12 crc kubenswrapper[5094]: E0220 09:13:12.722307 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa\": container with ID starting with 9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa not found: ID does not exist" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722338 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa"} err="failed to get container status \"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa\": rpc error: code = NotFound desc = could not find container \"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa\": container with ID starting with 
9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa not found: ID does not exist" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722360 5094 scope.go:117] "RemoveContainer" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" Feb 20 09:13:12 crc kubenswrapper[5094]: E0220 09:13:12.722557 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d\": container with ID starting with cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d not found: ID does not exist" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722581 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d"} err="failed to get container status \"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d\": rpc error: code = NotFound desc = could not find container \"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d\": container with ID starting with cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d not found: ID does not exist" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722594 5094 scope.go:117] "RemoveContainer" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" Feb 20 09:13:12 crc kubenswrapper[5094]: E0220 09:13:12.722953 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0\": container with ID starting with d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0 not found: ID does not exist" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" Feb 20 09:13:12 crc 
kubenswrapper[5094]: I0220 09:13:12.722976 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0"} err="failed to get container status \"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0\": rpc error: code = NotFound desc = could not find container \"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0\": container with ID starting with d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0 not found: ID does not exist" Feb 20 09:13:13 crc kubenswrapper[5094]: I0220 09:13:13.852597 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" path="/var/lib/kubelet/pods/18ea844c-6c90-4428-a5af-073c2b5500b6/volumes" Feb 20 09:13:23 crc kubenswrapper[5094]: I0220 09:13:23.756807 5094 generic.go:334] "Generic (PLEG): container finished" podID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerID="ed8526b86ab62d7bd0921b55737c1c6fd30ec623adb1a48a2a35b5434303c21d" exitCode=0 Feb 20 09:13:23 crc kubenswrapper[5094]: I0220 09:13:23.756863 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerDied","Data":"ed8526b86ab62d7bd0921b55737c1c6fd30ec623adb1a48a2a35b5434303c21d"} Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.240449 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284231 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284449 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.290736 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z" (OuterVolumeSpecName: "kube-api-access-s5t7z") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "kube-api-access-s5t7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.294825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph" (OuterVolumeSpecName: "ceph") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.313763 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.315431 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory" (OuterVolumeSpecName: "inventory") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386788 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386840 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386850 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386880 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.785906 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerDied","Data":"535d1f9c00da485e84fcf5b15dfd55c8a8dda766b04ae2351a6d6b1fe7889e10"} Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.785974 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.785978 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535d1f9c00da485e84fcf5b15dfd55c8a8dda766b04ae2351a6d6b1fe7889e10" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.906262 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8nqcz"] Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.906927 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerName="reboot-os-openstack-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907008 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerName="reboot-os-openstack-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.907086 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-utilities" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907150 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-utilities" Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.907212 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907267 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.907340 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-content" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907391 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-content" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907635 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907733 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerName="reboot-os-openstack-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.908515 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.912668 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.912846 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.919879 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8nqcz"] Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998227 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998338 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998649 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: 
\"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999055 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999228 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999258 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 
09:13:25.999334 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999386 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod 
\"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102219 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102347 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 
09:13:26.102400 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102676 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102744 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.106225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.108362 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.108509 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.109216 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.109327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " 
pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.110147 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.111580 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.112217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.112298 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.113029 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.117356 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.120604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.122081 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.270627 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.277793 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.977157 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8nqcz"]
Feb 20 09:13:26 crc kubenswrapper[5094]: W0220 09:13:26.982054 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f8dd6dc_a03c_4873_8ecb_e23bc464edff.slice/crio-b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51 WatchSource:0}: Error finding container b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51: Status 404 returned error can't find the container with id b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51
Feb 20 09:13:27 crc kubenswrapper[5094]: I0220 09:13:27.811570 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerStarted","Data":"7c96506019fea93df882e932dad5dd6521b64c73702e1151d7089d024cc92c02"}
Feb 20 09:13:27 crc kubenswrapper[5094]: I0220 09:13:27.811960 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerStarted","Data":"b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51"}
Feb 20 09:13:27 crc kubenswrapper[5094]: I0220 09:13:27.846229 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" podStartSLOduration=2.452895213 podStartE2EDuration="2.846203975s" podCreationTimestamp="2026-02-20 09:13:25 +0000 UTC" firstStartedPulling="2026-02-20 09:13:26.984305379 +0000 UTC m=+8821.856932120" lastFinishedPulling="2026-02-20 09:13:27.377614161 +0000 UTC m=+8822.250240882" observedRunningTime="2026-02-20 09:13:27.830756203 +0000 UTC m=+8822.703382934" watchObservedRunningTime="2026-02-20 09:13:27.846203975 +0000 UTC m=+8822.718830726"
Feb 20 09:13:47 crc kubenswrapper[5094]: I0220 09:13:47.005011 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerID="7c96506019fea93df882e932dad5dd6521b64c73702e1151d7089d024cc92c02" exitCode=0
Feb 20 09:13:47 crc kubenswrapper[5094]: I0220 09:13:47.005108 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerDied","Data":"7c96506019fea93df882e932dad5dd6521b64c73702e1151d7089d024cc92c02"}
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.477915 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527788 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527863 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528016 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528074 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528111 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528146 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528190 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528320 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528432 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528514 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") "
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.534423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.534679 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.534807 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk" (OuterVolumeSpecName: "kube-api-access-z2gtk") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "kube-api-access-z2gtk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.535996 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.536506 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.536588 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.536890 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.537138 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.538873 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.542005 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph" (OuterVolumeSpecName: "ceph") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.563084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory" (OuterVolumeSpecName: "inventory") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.568185 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635332 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635488 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635591 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635683 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635808 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635913 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635995 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636081 5094 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636177 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636262 5094 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636340 5094 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636430 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.030121 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerDied","Data":"b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51"}
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.030477 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.030217 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.202494 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ffl97"]
Feb 20 09:13:49 crc kubenswrapper[5094]: E0220 09:13:49.203322 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerName="install-certs-openstack-openstack-cell1"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.203468 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerName="install-certs-openstack-openstack-cell1"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.203912 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerName="install-certs-openstack-openstack-cell1"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.204884 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.210696 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.210766 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.213818 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ffl97"]
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.253313 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.253406 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.253450 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.253538 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355419 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355470 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355507 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.360872 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.360907 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.374149 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.375011 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.522288 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97"
Feb 20 09:13:50 crc kubenswrapper[5094]: I0220 09:13:50.130000 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ffl97"]
Feb 20 09:13:51 crc kubenswrapper[5094]: I0220 09:13:51.054235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerStarted","Data":"e7a67200ded8823653b016a9ba2b790f20cb85685e7f8e7f60228125c6bf1e16"}
Feb 20 09:13:51 crc kubenswrapper[5094]: I0220 09:13:51.055440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerStarted","Data":"639da57cbc720668a4113926cc775bd9a7a1d7c0bf9ce364b0e700fc1cee12ad"}
Feb 20 09:13:51 crc kubenswrapper[5094]: I0220 09:13:51.076529 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" podStartSLOduration=1.669723394 podStartE2EDuration="2.076511311s" podCreationTimestamp="2026-02-20 09:13:49 +0000 UTC" firstStartedPulling="2026-02-20 09:13:50.134444097 +0000 UTC m=+8845.007070818" lastFinishedPulling="2026-02-20 09:13:50.541232024 +0000 UTC m=+8845.413858735" observedRunningTime="2026-02-20 09:13:51.072652138 +0000 UTC m=+8845.945278849" watchObservedRunningTime="2026-02-20 09:13:51.076511311 +0000 UTC m=+8845.949138022"
Feb 20 09:13:52 crc kubenswrapper[5094]: I0220 09:13:52.063742 5094 generic.go:334] "Generic (PLEG): container finished" podID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerID="a7bba99fdce7206f2a33a9b6752071dfb931fd478046625f8eac18d1bd63f378" exitCode=0
Feb 20 09:13:52 crc kubenswrapper[5094]: I0220 09:13:52.063813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-r6vzk" event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerDied","Data":"a7bba99fdce7206f2a33a9b6752071dfb931fd478046625f8eac18d1bd63f378"}
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.597410 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk"
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.751763 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") "
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.751893 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") "
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.751940 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") "
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.752076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") "
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.752098 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") "
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.763364 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.767940 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd" (OuterVolumeSpecName: "kube-api-access-znpgd") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "kube-api-access-znpgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.793533 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.799615 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.809757 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory" (OuterVolumeSpecName: "inventory") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854839 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854882 5094 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854893 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854953 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854963 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.091451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-r6vzk" event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerDied","Data":"e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b"}
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.091492 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.091531 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.189955 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-f5lls"]
Feb 20 09:13:54 crc kubenswrapper[5094]: E0220 09:13:54.190635 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerName="ovn-openstack-openstack-networker"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.190665 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerName="ovn-openstack-openstack-networker"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.191222 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerName="ovn-openstack-openstack-networker"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.192426 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.195342 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.195413 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.197844 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.197845 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.208973 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-f5lls"]
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.264442 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls"
Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.264794 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls"
Feb 20
crc kubenswrapper[5094]: I0220 09:13:54.265105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.265259 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.265347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.265471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367254 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367337 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367393 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367432 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367473 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367511 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.371507 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.371591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.372305 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.373600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.375920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.390195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.511337 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:55 crc kubenswrapper[5094]: W0220 09:13:55.046931 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb741c1f4_f408_486b_bd44_3ae1fcadc83b.slice/crio-cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2 WatchSource:0}: Error finding container cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2: Status 404 returned error can't find the container with id cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2 Feb 20 09:13:55 crc kubenswrapper[5094]: I0220 09:13:55.053069 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-f5lls"] Feb 20 09:13:55 crc kubenswrapper[5094]: I0220 09:13:55.100754 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerStarted","Data":"cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2"} Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.112070 5094 generic.go:334] "Generic (PLEG): container finished" podID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerID="e7a67200ded8823653b016a9ba2b790f20cb85685e7f8e7f60228125c6bf1e16" exitCode=0 Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.112190 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerDied","Data":"e7a67200ded8823653b016a9ba2b790f20cb85685e7f8e7f60228125c6bf1e16"} Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.115325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" 
event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerStarted","Data":"60033d64494b24ef736d90599a6c3267ec69b6bcf9cf95209691bf9761ff5b97"} Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.154722 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" podStartSLOduration=1.736795469 podStartE2EDuration="2.154687112s" podCreationTimestamp="2026-02-20 09:13:54 +0000 UTC" firstStartedPulling="2026-02-20 09:13:55.049940604 +0000 UTC m=+8849.922567335" lastFinishedPulling="2026-02-20 09:13:55.467832257 +0000 UTC m=+8850.340458978" observedRunningTime="2026-02-20 09:13:56.148222856 +0000 UTC m=+8851.020849587" watchObservedRunningTime="2026-02-20 09:13:56.154687112 +0000 UTC m=+8851.027313823" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.614765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.745821 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.746236 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.746402 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod 
\"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.746490 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.751537 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc" (OuterVolumeSpecName: "kube-api-access-mvklc") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "kube-api-access-mvklc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.751956 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph" (OuterVolumeSpecName: "ceph") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.785999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory" (OuterVolumeSpecName: "inventory") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.811983 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.856986 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.857130 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.857150 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.857474 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.137832 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerDied","Data":"639da57cbc720668a4113926cc775bd9a7a1d7c0bf9ce364b0e700fc1cee12ad"} Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.137875 5094 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639da57cbc720668a4113926cc775bd9a7a1d7c0bf9ce364b0e700fc1cee12ad" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.137959 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.283625 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pswsx"] Feb 20 09:13:58 crc kubenswrapper[5094]: E0220 09:13:58.284260 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerName="ceph-client-openstack-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.284283 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerName="ceph-client-openstack-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.284572 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerName="ceph-client-openstack-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.285489 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.288562 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.289624 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.290131 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.301521 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pswsx"] Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369286 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369376 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " 
pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369534 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369627 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471168 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471232 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sx9r\" (UniqueName: 
\"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471293 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471362 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471458 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471538 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.472753 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.475364 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.476330 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.476476 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.477026 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.492228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.610417 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:59 crc kubenswrapper[5094]: I0220 09:13:59.161393 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pswsx"] Feb 20 09:13:59 crc kubenswrapper[5094]: W0220 09:13:59.162191 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f35b6d1_3070_44cf_bdf8_6376b2434586.slice/crio-cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30 WatchSource:0}: Error finding container cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30: Status 404 returned error can't find the container with id cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30 Feb 20 09:14:00 crc kubenswrapper[5094]: I0220 09:14:00.163055 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerStarted","Data":"3c888b2b05fec27e60745bb98c9fc87f20df720e5a76a1cf1200d076a1e8a640"} Feb 20 09:14:00 crc kubenswrapper[5094]: I0220 09:14:00.163414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerStarted","Data":"cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30"} Feb 20 09:14:00 crc kubenswrapper[5094]: I0220 09:14:00.186597 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-openstack-openstack-cell1-pswsx" podStartSLOduration=1.762382777 podStartE2EDuration="2.186576812s" podCreationTimestamp="2026-02-20 09:13:58 +0000 UTC" firstStartedPulling="2026-02-20 09:13:59.16647151 +0000 UTC m=+8854.039098231" lastFinishedPulling="2026-02-20 09:13:59.590665545 +0000 UTC m=+8854.463292266" observedRunningTime="2026-02-20 09:14:00.183079467 +0000 UTC m=+8855.055706178" watchObservedRunningTime="2026-02-20 09:14:00.186576812 +0000 UTC m=+8855.059203523"
Feb 20 09:14:04 crc kubenswrapper[5094]: I0220 09:14:04.106807 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:14:04 crc kubenswrapper[5094]: I0220 09:14:04.107943 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:14:34 crc kubenswrapper[5094]: I0220 09:14:34.106881 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:14:34 crc kubenswrapper[5094]: I0220 09:14:34.107352 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:14:49 crc kubenswrapper[5094]: I0220 09:14:49.653038 5094 generic.go:334] "Generic (PLEG): container finished" podID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerID="60033d64494b24ef736d90599a6c3267ec69b6bcf9cf95209691bf9761ff5b97" exitCode=0
Feb 20 09:14:49 crc kubenswrapper[5094]: I0220 09:14:49.653100 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerDied","Data":"60033d64494b24ef736d90599a6c3267ec69b6bcf9cf95209691bf9761ff5b97"}
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.120495 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls"
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231686 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") "
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231752 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") "
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231870 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") "
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231887 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") "
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") "
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.232057 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") "
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.238914 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.241523 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln" (OuterVolumeSpecName: "kube-api-access-lcvln") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "kube-api-access-lcvln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.269646 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.282935 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory" (OuterVolumeSpecName: "inventory") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.293829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.303772 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.334853 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") on node \"crc\" DevicePath \"\""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335144 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335250 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335337 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335422 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335505 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.677355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerDied","Data":"cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2"}
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.677400 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2"
Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.677934 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.150163 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"]
Feb 20 09:15:00 crc kubenswrapper[5094]: E0220 09:15:00.151231 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerName="neutron-metadata-openstack-openstack-networker"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.151250 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerName="neutron-metadata-openstack-openstack-networker"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.151531 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerName="neutron-metadata-openstack-openstack-networker"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.152926 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.154937 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.155777 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.161864 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"]
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.221002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.221128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.221377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.323638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.323752 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.323903 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.325091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.330412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.343467 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.505448 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.992627 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"]
Feb 20 09:15:01 crc kubenswrapper[5094]: I0220 09:15:01.769411 5094 generic.go:334] "Generic (PLEG): container finished" podID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerID="54cabe5f22fe8cc34888629b6ad81ec4c78f22e5ddf13beea44532e8ad37533e" exitCode=0
Feb 20 09:15:01 crc kubenswrapper[5094]: I0220 09:15:01.769473 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" event={"ID":"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1","Type":"ContainerDied","Data":"54cabe5f22fe8cc34888629b6ad81ec4c78f22e5ddf13beea44532e8ad37533e"}
Feb 20 09:15:01 crc kubenswrapper[5094]: I0220 09:15:01.769817 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" event={"ID":"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1","Type":"ContainerStarted","Data":"f6e87b5b2102f7eaa839fc05ee79c8bec2aebcbb17ddd051d0ff98aa9943e799"}
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.244091 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.414676 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") "
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.414772 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") "
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.414864 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") "
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.415443 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume" (OuterVolumeSpecName: "config-volume") pod "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" (UID: "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.416099 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.420348 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9" (OuterVolumeSpecName: "kube-api-access-pbxs9") pod "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" (UID: "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1"). InnerVolumeSpecName "kube-api-access-pbxs9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.421889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" (UID: "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.517730 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.517762 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.798497 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" event={"ID":"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1","Type":"ContainerDied","Data":"f6e87b5b2102f7eaa839fc05ee79c8bec2aebcbb17ddd051d0ff98aa9943e799"}
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.798545 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6e87b5b2102f7eaa839fc05ee79c8bec2aebcbb17ddd051d0ff98aa9943e799"
Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.798898 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.106575 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.106910 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.107023 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.107901 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.108047 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf" gracePeriod=600
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.336193 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"]
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.353481 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"]
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.829767 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf" exitCode=0
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.830287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf"}
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.830322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"}
Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.830344 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:15:05 crc kubenswrapper[5094]: I0220 09:15:05.853912 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" path="/var/lib/kubelet/pods/5c7c75bd-9812-4d90-80ea-08eda0f926fc/volumes"
Feb 20 09:15:09 crc kubenswrapper[5094]: I0220 09:15:09.904162 5094 generic.go:334] "Generic (PLEG): container finished" podID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerID="3c888b2b05fec27e60745bb98c9fc87f20df720e5a76a1cf1200d076a1e8a640" exitCode=0
Feb 20 09:15:09 crc kubenswrapper[5094]: I0220 09:15:09.904278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerDied","Data":"3c888b2b05fec27e60745bb98c9fc87f20df720e5a76a1cf1200d076a1e8a640"}
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.484754 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx"
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635342 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") "
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635701 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") "
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635889 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") "
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") "
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.636020 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") "
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.636190 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") "
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.641561 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph" (OuterVolumeSpecName: "ceph") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.643124 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.650243 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r" (OuterVolumeSpecName: "kube-api-access-5sx9r") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "kube-api-access-5sx9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.668541 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.672045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.675865 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory" (OuterVolumeSpecName: "inventory") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739578 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739624 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739640 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739654 5094 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739667 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739681 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.929818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerDied","Data":"cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30"}
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.929862 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30"
Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.929921 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.014416 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"]
Feb 20 09:15:12 crc kubenswrapper[5094]: E0220 09:15:12.014933 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerName="collect-profiles"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.014956 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerName="collect-profiles"
Feb 20 09:15:12 crc kubenswrapper[5094]: E0220 09:15:12.014980 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerName="ovn-openstack-openstack-cell1"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.014989 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerName="ovn-openstack-openstack-cell1"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.015224 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerName="ovn-openstack-openstack-cell1"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.015245 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerName="collect-profiles"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.016079 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.019804 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020004 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020009 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020668 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020949 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.022121 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.036589 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"]
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151705 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151808 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"
Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.152025 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtb5q\"
(UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.152050 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253443 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253499 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253624 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253644 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.258638 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.258638 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.260046 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.261354 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.261470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.262033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.274395 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.344669 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.927453 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"] Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.932388 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.943628 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerStarted","Data":"eced22e9311585fc8a9e80929bb7e4e1c9a1f94226fce854695e14c779761fb4"} Feb 20 09:15:13 crc kubenswrapper[5094]: I0220 09:15:13.955626 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerStarted","Data":"2d9f8b5996e148deeb14413be699a051f445a16100af15b499c642d8cd36aaf1"} Feb 20 09:15:13 crc kubenswrapper[5094]: I0220 09:15:13.975679 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" podStartSLOduration=2.534631268 podStartE2EDuration="2.975664488s" podCreationTimestamp="2026-02-20 09:15:11 +0000 UTC" firstStartedPulling="2026-02-20 09:15:12.932207885 +0000 UTC m=+8927.804834596" lastFinishedPulling="2026-02-20 09:15:13.373241105 +0000 UTC m=+8928.245867816" observedRunningTime="2026-02-20 09:15:13.97368419 +0000 UTC m=+8928.846310901" watchObservedRunningTime="2026-02-20 09:15:13.975664488 +0000 UTC m=+8928.848291199" Feb 20 09:15:44 crc kubenswrapper[5094]: I0220 09:15:44.069900 5094 scope.go:117] "RemoveContainer" containerID="89a68b3798c7e61a71c5a1f766e1642edc8983858caba5c4db74959c3a8cdcec" Feb 20 09:16:09 crc kubenswrapper[5094]: I0220 09:16:09.551241 5094 
generic.go:334] "Generic (PLEG): container finished" podID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerID="2d9f8b5996e148deeb14413be699a051f445a16100af15b499c642d8cd36aaf1" exitCode=0 Feb 20 09:16:09 crc kubenswrapper[5094]: I0220 09:16:09.551329 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerDied","Data":"2d9f8b5996e148deeb14413be699a051f445a16100af15b499c642d8cd36aaf1"} Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.088580 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205644 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205677 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205882 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205916 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205949 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.214854 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph" (OuterVolumeSpecName: "ceph") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.214882 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.214905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q" (OuterVolumeSpecName: "kube-api-access-jtb5q") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "kube-api-access-jtb5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.238423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.249914 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308123 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308167 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308179 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308190 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308198 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.345927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory" (OuterVolumeSpecName: "inventory") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.362687 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.410523 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.410573 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.571125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerDied","Data":"eced22e9311585fc8a9e80929bb7e4e1c9a1f94226fce854695e14c779761fb4"} Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.571602 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eced22e9311585fc8a9e80929bb7e4e1c9a1f94226fce854695e14c779761fb4" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.571202 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.684836 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f6rmf"] Feb 20 09:16:11 crc kubenswrapper[5094]: E0220 09:16:11.685470 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerName="neutron-metadata-openstack-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.685506 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerName="neutron-metadata-openstack-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.685877 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerName="neutron-metadata-openstack-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.686833 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.688313 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.688312 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.688949 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.689019 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.689663 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.703924 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f6rmf"] Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.819740 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.819801 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc 
kubenswrapper[5094]: I0220 09:16:11.819836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.819970 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.820044 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.820272 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod 
\"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921886 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.926984 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.927150 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.928173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.928672 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.929139 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.937692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:12 crc kubenswrapper[5094]: I0220 09:16:12.012456 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:12 crc kubenswrapper[5094]: I0220 09:16:12.544803 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f6rmf"] Feb 20 09:16:12 crc kubenswrapper[5094]: I0220 09:16:12.579919 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerStarted","Data":"27d38e30a80a5d08e54616c5028c63d34bbc4b444a5bebcc5405786abf7e2ca0"} Feb 20 09:16:13 crc kubenswrapper[5094]: I0220 09:16:13.589629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerStarted","Data":"11e98b436e832bd8d5225286da85ae8eb5b7dfbad585fc8d6f918f1ffca98c48"} Feb 20 09:16:13 crc kubenswrapper[5094]: I0220 09:16:13.618586 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" podStartSLOduration=2.187018211 podStartE2EDuration="2.618538092s" 
podCreationTimestamp="2026-02-20 09:16:11 +0000 UTC" firstStartedPulling="2026-02-20 09:16:12.55231168 +0000 UTC m=+8987.424938391" lastFinishedPulling="2026-02-20 09:16:12.983831551 +0000 UTC m=+8987.856458272" observedRunningTime="2026-02-20 09:16:13.606592864 +0000 UTC m=+8988.479219575" watchObservedRunningTime="2026-02-20 09:16:13.618538092 +0000 UTC m=+8988.491164803" Feb 20 09:17:04 crc kubenswrapper[5094]: I0220 09:17:04.106550 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:17:04 crc kubenswrapper[5094]: I0220 09:17:04.107265 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:17:34 crc kubenswrapper[5094]: I0220 09:17:34.107298 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:17:34 crc kubenswrapper[5094]: I0220 09:17:34.108052 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.106668 5094 patch_prober.go:28] interesting 
pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.107387 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.107475 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.108605 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.108742 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" gracePeriod=600 Feb 20 09:18:04 crc kubenswrapper[5094]: E0220 09:18:04.252946 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.825229 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" exitCode=0 Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.825292 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"} Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.825341 5094 scope.go:117] "RemoveContainer" containerID="8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.831827 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:04 crc kubenswrapper[5094]: E0220 09:18:04.832444 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:19 crc kubenswrapper[5094]: I0220 09:18:19.842097 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:19 crc kubenswrapper[5094]: E0220 09:18:19.843192 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:34 crc kubenswrapper[5094]: I0220 09:18:34.840812 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:34 crc kubenswrapper[5094]: E0220 09:18:34.843761 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.280222 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.284991 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.295232 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.345766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.345841 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.345918 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.448243 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.448861 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.449005 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.449026 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.449339 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.468582 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.620472 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.154342 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.508456 5094 generic.go:334] "Generic (PLEG): container finished" podID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerID="fa5064b945341f85314c31fcfe63dac8cc7dacd470b3a99df5d85bd58b38d0bd" exitCode=0 Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.508839 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"fa5064b945341f85314c31fcfe63dac8cc7dacd470b3a99df5d85bd58b38d0bd"} Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.508878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerStarted","Data":"0137a537f9282a07e616bd1f6dc47d28ccb212e130da010ebef4074da0a74551"} Feb 20 09:18:40 crc kubenswrapper[5094]: I0220 09:18:40.527783 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerStarted","Data":"b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed"} Feb 20 09:18:41 crc kubenswrapper[5094]: I0220 09:18:41.540719 5094 generic.go:334] "Generic (PLEG): container finished" podID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerID="b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed" exitCode=0 Feb 20 09:18:41 crc kubenswrapper[5094]: I0220 09:18:41.540925 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" 
event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed"} Feb 20 09:18:42 crc kubenswrapper[5094]: I0220 09:18:42.552506 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerStarted","Data":"daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f"} Feb 20 09:18:42 crc kubenswrapper[5094]: I0220 09:18:42.576595 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g8njc" podStartSLOduration=2.087858215 podStartE2EDuration="4.576573638s" podCreationTimestamp="2026-02-20 09:18:38 +0000 UTC" firstStartedPulling="2026-02-20 09:18:39.517788129 +0000 UTC m=+9134.390414840" lastFinishedPulling="2026-02-20 09:18:42.006503522 +0000 UTC m=+9136.879130263" observedRunningTime="2026-02-20 09:18:42.566156166 +0000 UTC m=+9137.438782917" watchObservedRunningTime="2026-02-20 09:18:42.576573638 +0000 UTC m=+9137.449200349" Feb 20 09:18:45 crc kubenswrapper[5094]: I0220 09:18:45.849163 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:45 crc kubenswrapper[5094]: E0220 09:18:45.850147 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:48 crc kubenswrapper[5094]: I0220 09:18:48.621096 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:48 crc 
kubenswrapper[5094]: I0220 09:18:48.621674 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:48 crc kubenswrapper[5094]: I0220 09:18:48.710190 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:50 crc kubenswrapper[5094]: I0220 09:18:50.393852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:50 crc kubenswrapper[5094]: I0220 09:18:50.454363 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:51 crc kubenswrapper[5094]: I0220 09:18:51.663209 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g8njc" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="registry-server" containerID="cri-o://daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f" gracePeriod=2 Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.679377 5094 generic.go:334] "Generic (PLEG): container finished" podID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerID="daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f" exitCode=0 Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.680908 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f"} Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.773161 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.958813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"811e4fa6-3b96-40b7-88a3-067b582b0683\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.959543 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"811e4fa6-3b96-40b7-88a3-067b582b0683\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.959629 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"811e4fa6-3b96-40b7-88a3-067b582b0683\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.960425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities" (OuterVolumeSpecName: "utilities") pod "811e4fa6-3b96-40b7-88a3-067b582b0683" (UID: "811e4fa6-3b96-40b7-88a3-067b582b0683"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.968550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs" (OuterVolumeSpecName: "kube-api-access-s7tzs") pod "811e4fa6-3b96-40b7-88a3-067b582b0683" (UID: "811e4fa6-3b96-40b7-88a3-067b582b0683"). InnerVolumeSpecName "kube-api-access-s7tzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.021347 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "811e4fa6-3b96-40b7-88a3-067b582b0683" (UID: "811e4fa6-3b96-40b7-88a3-067b582b0683"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.061086 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.061121 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") on node \"crc\" DevicePath \"\"" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.061134 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.695100 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"0137a537f9282a07e616bd1f6dc47d28ccb212e130da010ebef4074da0a74551"} Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.695186 5094 scope.go:117] "RemoveContainer" containerID="daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.695186 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.738882 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.749765 5094 scope.go:117] "RemoveContainer" containerID="b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.773527 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.791466 5094 scope.go:117] "RemoveContainer" containerID="fa5064b945341f85314c31fcfe63dac8cc7dacd470b3a99df5d85bd58b38d0bd" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.854894 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" path="/var/lib/kubelet/pods/811e4fa6-3b96-40b7-88a3-067b582b0683/volumes" Feb 20 09:19:00 crc kubenswrapper[5094]: I0220 09:19:00.841577 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:00 crc kubenswrapper[5094]: E0220 09:19:00.842922 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:15 crc kubenswrapper[5094]: I0220 09:19:15.856109 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:15 crc kubenswrapper[5094]: E0220 09:19:15.857497 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:30 crc kubenswrapper[5094]: I0220 09:19:30.840450 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:30 crc kubenswrapper[5094]: E0220 09:19:30.841261 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:44 crc kubenswrapper[5094]: I0220 09:19:44.840879 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:44 crc kubenswrapper[5094]: E0220 09:19:44.842629 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:57 crc kubenswrapper[5094]: I0220 09:19:57.840652 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:57 crc kubenswrapper[5094]: E0220 09:19:57.841345 5094 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:08 crc kubenswrapper[5094]: I0220 09:20:08.841301 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:08 crc kubenswrapper[5094]: E0220 09:20:08.842407 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:19 crc kubenswrapper[5094]: I0220 09:20:19.841836 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:19 crc kubenswrapper[5094]: E0220 09:20:19.842926 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:31 crc kubenswrapper[5094]: I0220 09:20:31.840905 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:31 crc kubenswrapper[5094]: E0220 09:20:31.841940 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:44 crc kubenswrapper[5094]: I0220 09:20:44.841039 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:44 crc kubenswrapper[5094]: E0220 09:20:44.842191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:56 crc kubenswrapper[5094]: I0220 09:20:56.841567 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:56 crc kubenswrapper[5094]: E0220 09:20:56.844457 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:59 crc kubenswrapper[5094]: I0220 09:20:59.009607 5094 generic.go:334] "Generic (PLEG): container finished" podID="a552adeb-5834-4cfe-8ee3-56472dda5cab" 
containerID="11e98b436e832bd8d5225286da85ae8eb5b7dfbad585fc8d6f918f1ffca98c48" exitCode=0 Feb 20 09:20:59 crc kubenswrapper[5094]: I0220 09:20:59.009813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerDied","Data":"11e98b436e832bd8d5225286da85ae8eb5b7dfbad585fc8d6f918f1ffca98c48"} Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.434731 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.576874 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.576935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577049 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577768 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod 
\"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577792 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577964 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.582146 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.585906 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph" (OuterVolumeSpecName: "ceph") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.586572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2" (OuterVolumeSpecName: "kube-api-access-f5bl2") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "kube-api-access-f5bl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.603289 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory" (OuterVolumeSpecName: "inventory") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.603901 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.604256 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.679975 5094 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680006 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680016 5094 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680024 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680033 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680041 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.038971 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerDied","Data":"27d38e30a80a5d08e54616c5028c63d34bbc4b444a5bebcc5405786abf7e2ca0"} 
Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.039004 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.039009 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27d38e30a80a5d08e54616c5028c63d34bbc4b444a5bebcc5405786abf7e2ca0" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.209834 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-tclm2"] Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.210920 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-utilities" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.210946 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-utilities" Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.210976 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-content" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.210983 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-content" Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.211000 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a552adeb-5834-4cfe-8ee3-56472dda5cab" containerName="libvirt-openstack-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211006 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a552adeb-5834-4cfe-8ee3-56472dda5cab" containerName="libvirt-openstack-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.211016 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" 
containerName="registry-server" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211023 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="registry-server" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211478 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="registry-server" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211521 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a552adeb-5834-4cfe-8ee3-56472dda5cab" containerName="libvirt-openstack-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.213673 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.225276 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.225628 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.225682 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.227095 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.227626 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.229093 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.242999 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.245218 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-tclm2"] Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395571 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395615 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395975 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396024 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396135 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396229 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396376 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod 
\"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498482 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498616 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498652 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498733 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498768 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499446 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499456 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: 
\"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499523 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499656 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499697 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499760 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.500002 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.503155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.503616 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.504251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.504372 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.504457 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.505479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.506884 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.508134 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.510254 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.511920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.519672 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.551262 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:02 crc kubenswrapper[5094]: I0220 09:21:02.156943 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:21:02 crc kubenswrapper[5094]: I0220 09:21:02.159308 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-tclm2"] Feb 20 09:21:03 crc kubenswrapper[5094]: I0220 09:21:03.061779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerStarted","Data":"c623c6bcb73425cb5942caee11d7bb36d9dab38dcb8e5a059b4e2fee9f335a16"} Feb 20 09:21:03 crc kubenswrapper[5094]: I0220 09:21:03.062417 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerStarted","Data":"b5f00a6875a9dca730468d3130ccbd421639490aee72bce64c7d96373c50cba6"} Feb 20 09:21:03 crc kubenswrapper[5094]: I0220 09:21:03.087855 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" podStartSLOduration=1.60755362 podStartE2EDuration="2.087827533s" podCreationTimestamp="2026-02-20 09:21:01 +0000 UTC" firstStartedPulling="2026-02-20 09:21:02.156751845 +0000 UTC m=+9277.029378556" lastFinishedPulling="2026-02-20 09:21:02.637025758 +0000 UTC m=+9277.509652469" observedRunningTime="2026-02-20 09:21:03.080884756 +0000 UTC m=+9277.953511477" watchObservedRunningTime="2026-02-20 09:21:03.087827533 +0000 UTC m=+9277.960454254" Feb 20 09:21:08 crc kubenswrapper[5094]: I0220 09:21:08.840421 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:21:08 crc kubenswrapper[5094]: E0220 09:21:08.841147 5094 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:21:22 crc kubenswrapper[5094]: I0220 09:21:22.840232 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:21:22 crc kubenswrapper[5094]: E0220 09:21:22.840886 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:21:28 crc kubenswrapper[5094]: I0220 09:21:28.968880 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:28 crc kubenswrapper[5094]: I0220 09:21:28.971264 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.066868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.067208 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.067318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.115877 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.161768 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"] Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.164345 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.169410 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.169494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.169591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.170185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.170761 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.182025 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"]
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.244883 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.271880 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.271943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.272362 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.294321 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.374250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.374421 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.374466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.375163 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.375460 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.399716 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.484238 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.838998 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"]
Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.038200 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"]
Feb 20 09:21:30 crc kubenswrapper[5094]: W0220 09:21:30.051508 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac1cb6da_bf56_4dc5_9b34_251a55e75ba6.slice/crio-5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50 WatchSource:0}: Error finding container 5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50: Status 404 returned error can't find the container with id 5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50
Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.333420 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648" exitCode=0
Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.333486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648"}
Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.333552 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerStarted","Data":"3432a63a3e35886fc691c3469a2a6ea3d3dadcd62ce5fa72c15aaf723fb7dc0c"}
Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.335037 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19" exitCode=0
Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.335063 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19"}
Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.335079 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerStarted","Data":"5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50"}
Feb 20 09:21:32 crc kubenswrapper[5094]: I0220 09:21:32.361041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerStarted","Data":"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"}
Feb 20 09:21:32 crc kubenswrapper[5094]: I0220 09:21:32.366987 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a" exitCode=0
Feb 20 09:21:32 crc kubenswrapper[5094]: I0220 09:21:32.367039 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a"}
Feb 20 09:21:33 crc kubenswrapper[5094]: I0220 09:21:33.380000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerStarted","Data":"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"}
Feb 20 09:21:33 crc kubenswrapper[5094]: I0220 09:21:33.403867 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-snghl" podStartSLOduration=2.937889493 podStartE2EDuration="5.403837489s" podCreationTimestamp="2026-02-20 09:21:28 +0000 UTC" firstStartedPulling="2026-02-20 09:21:30.334914397 +0000 UTC m=+9305.207541108" lastFinishedPulling="2026-02-20 09:21:32.800862393 +0000 UTC m=+9307.673489104" observedRunningTime="2026-02-20 09:21:33.398510191 +0000 UTC m=+9308.271136902" watchObservedRunningTime="2026-02-20 09:21:33.403837489 +0000 UTC m=+9308.276464230"
Feb 20 09:21:35 crc kubenswrapper[5094]: E0220 09:21:35.189957 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac1cb6da_bf56_4dc5_9b34_251a55e75ba6.slice/crio-96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 09:21:35 crc kubenswrapper[5094]: I0220 09:21:35.400170 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6" exitCode=0
Feb 20 09:21:35 crc kubenswrapper[5094]: I0220 09:21:35.400236 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"}
Feb 20 09:21:35 crc kubenswrapper[5094]: I0220 09:21:35.847201 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"
Feb 20 09:21:35 crc kubenswrapper[5094]: E0220 09:21:35.847583 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:21:36 crc kubenswrapper[5094]: I0220 09:21:36.411623 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerStarted","Data":"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"}
Feb 20 09:21:36 crc kubenswrapper[5094]: I0220 09:21:36.441679 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wkskm" podStartSLOduration=2.013292667 podStartE2EDuration="7.441657283s" podCreationTimestamp="2026-02-20 09:21:29 +0000 UTC" firstStartedPulling="2026-02-20 09:21:30.336895694 +0000 UTC m=+9305.209522405" lastFinishedPulling="2026-02-20 09:21:35.76526031 +0000 UTC m=+9310.637887021" observedRunningTime="2026-02-20 09:21:36.430523665 +0000 UTC m=+9311.303150366" watchObservedRunningTime="2026-02-20 09:21:36.441657283 +0000 UTC m=+9311.314283994"
Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.295924 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.296522 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.485420 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.485736 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:40 crc kubenswrapper[5094]: I0220 09:21:40.357006 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-snghl" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" probeResult="failure" output=<
Feb 20 09:21:40 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 09:21:40 crc kubenswrapper[5094]: >
Feb 20 09:21:40 crc kubenswrapper[5094]: I0220 09:21:40.533611 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wkskm" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" probeResult="failure" output=<
Feb 20 09:21:40 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 09:21:40 crc kubenswrapper[5094]: >
Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.358122 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.793145 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.815653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.860760 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"]
Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.870241 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:50 crc kubenswrapper[5094]: I0220 09:21:50.542436 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-snghl" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" containerID="cri-o://d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9" gracePeriod=2
Feb 20 09:21:50 crc kubenswrapper[5094]: I0220 09:21:50.840847 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"
Feb 20 09:21:50 crc kubenswrapper[5094]: E0220 09:21:50.841509 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.103361 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.179899 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") "
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.179975 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") "
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.180046 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") "
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.180617 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities" (OuterVolumeSpecName: "utilities") pod "40cbd310-80bd-43a5-aa9f-d151d8397a5e" (UID: "40cbd310-80bd-43a5-aa9f-d151d8397a5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.180795 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.212974 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4" (OuterVolumeSpecName: "kube-api-access-tzlr4") pod "40cbd310-80bd-43a5-aa9f-d151d8397a5e" (UID: "40cbd310-80bd-43a5-aa9f-d151d8397a5e"). InnerVolumeSpecName "kube-api-access-tzlr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.248228 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40cbd310-80bd-43a5-aa9f-d151d8397a5e" (UID: "40cbd310-80bd-43a5-aa9f-d151d8397a5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.282899 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.282933 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") on node \"crc\" DevicePath \"\""
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553427 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9" exitCode=0
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553467 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"}
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"3432a63a3e35886fc691c3469a2a6ea3d3dadcd62ce5fa72c15aaf723fb7dc0c"}
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553511 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553520 5094 scope.go:117] "RemoveContainer" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.587529 5094 scope.go:117] "RemoveContainer" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.588463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"]
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.598467 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"]
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.627668 5094 scope.go:117] "RemoveContainer" containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.671904 5094 scope.go:117] "RemoveContainer" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"
Feb 20 09:21:51 crc kubenswrapper[5094]: E0220 09:21:51.672376 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9\": container with ID starting with d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9 not found: ID does not exist" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672427 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"} err="failed to get container status \"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9\": rpc error: code = NotFound desc = could not find container \"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9\": container with ID starting with d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9 not found: ID does not exist"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672456 5094 scope.go:117] "RemoveContainer" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a"
Feb 20 09:21:51 crc kubenswrapper[5094]: E0220 09:21:51.672845 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a\": container with ID starting with c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a not found: ID does not exist" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672890 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a"} err="failed to get container status \"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a\": rpc error: code = NotFound desc = could not find container \"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a\": container with ID starting with c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a not found: ID does not exist"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672918 5094 scope.go:117] "RemoveContainer" containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648"
Feb 20 09:21:51 crc kubenswrapper[5094]: E0220 09:21:51.673332 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648\": container with ID starting with a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648 not found: ID does not exist" containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.673369 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648"} err="failed to get container status \"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648\": rpc error: code = NotFound desc = could not find container \"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648\": container with ID starting with a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648 not found: ID does not exist"
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.821831 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"]
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.822611 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wkskm" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" containerID="cri-o://aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7" gracePeriod=2
Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.860213 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" path="/var/lib/kubelet/pods/40cbd310-80bd-43a5-aa9f-d151d8397a5e/volumes"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.381566 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.514104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") "
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.514667 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") "
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.514797 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") "
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.515687 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities" (OuterVolumeSpecName: "utilities") pod "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" (UID: "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.521733 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j" (OuterVolumeSpecName: "kube-api-access-hhz4j") pod "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" (UID: "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6"). InnerVolumeSpecName "kube-api-access-hhz4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.568463 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7" exitCode=0
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.568557 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.570211 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"}
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.570355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50"}
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.570427 5094 scope.go:117] "RemoveContainer" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.601862 5094 scope.go:117] "RemoveContainer" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.623312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") on node \"crc\" DevicePath \"\""
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.623529 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.636359 5094 scope.go:117] "RemoveContainer" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.658525 5094 scope.go:117] "RemoveContainer" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"
Feb 20 09:21:52 crc kubenswrapper[5094]: E0220 09:21:52.659288 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7\": container with ID starting with aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7 not found: ID does not exist" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.659397 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"} err="failed to get container status \"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7\": rpc error: code = NotFound desc = could not find container \"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7\": container with ID starting with aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7 not found: ID does not exist"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.659493 5094 scope.go:117] "RemoveContainer" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"
Feb 20 09:21:52 crc kubenswrapper[5094]: E0220 09:21:52.659856 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6\": container with ID starting with 96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6 not found: ID does not exist" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.659975 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"} err="failed to get container status \"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6\": rpc error: code = NotFound desc = could not find container \"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6\": container with ID starting with 96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6 not found: ID does not exist"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.660058 5094 scope.go:117] "RemoveContainer" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19"
Feb 20 09:21:52 crc kubenswrapper[5094]: E0220 09:21:52.660398 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19\": container with ID starting with dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19 not found: ID does not exist" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.660497 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19"} err="failed to get container status \"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19\": rpc error: code = NotFound desc = could not find container \"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19\": container with ID starting with dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19 not found: ID does not exist"
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.676925 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" (UID: "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.726009 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.898139 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"]
Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.908499 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"]
Feb 20 09:21:53 crc kubenswrapper[5094]: I0220 09:21:53.854933 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" path="/var/lib/kubelet/pods/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6/volumes"
Feb 20 09:22:04 crc kubenswrapper[5094]: I0220 09:22:04.840918 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"
Feb 20 09:22:04 crc kubenswrapper[5094]: E0220 09:22:04.841776 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:22:17 crc kubenswrapper[5094]: I0220 09:22:17.842232 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"
Feb 20 09:22:17 crc kubenswrapper[5094]: E0220 09:22:17.845826 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:22:31 crc kubenswrapper[5094]: I0220 09:22:31.840334 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"
Feb 20 09:22:31 crc kubenswrapper[5094]: E0220 09:22:31.841071 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:22:45 crc kubenswrapper[5094]: I0220 09:22:45.849382 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"
Feb 20 09:22:45 crc kubenswrapper[5094]: E0220 09:22:45.850285 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:22:59 crc kubenswrapper[5094]: I0220 09:22:59.839774 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:22:59 crc kubenswrapper[5094]: E0220 09:22:59.840555 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:23:14 crc kubenswrapper[5094]: I0220 09:23:14.841251 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:23:15 crc kubenswrapper[5094]: I0220 09:23:15.607892 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f"} Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.485900 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487122 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487140 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487180 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-utilities" Feb 20 09:23:20 crc 
kubenswrapper[5094]: I0220 09:23:20.487190 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-utilities" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487208 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487220 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487234 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-utilities" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487242 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-utilities" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487258 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487266 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487278 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487589 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" Feb 20 09:23:20 crc 
kubenswrapper[5094]: I0220 09:23:20.487612 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.489484 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.509060 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.585422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.586401 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tds5f\" (UniqueName: \"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.586510 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tds5f\" (UniqueName: 
\"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687985 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.688475 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.710673 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tds5f\" (UniqueName: 
\"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.825167 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:21 crc kubenswrapper[5094]: I0220 09:23:21.327301 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:21 crc kubenswrapper[5094]: I0220 09:23:21.683612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerStarted","Data":"4b359f1a3fa7bb1a2550a1bda8b29ed69dfc413e256d39e62079e81f061afc15"} Feb 20 09:23:22 crc kubenswrapper[5094]: I0220 09:23:22.695953 5094 generic.go:334] "Generic (PLEG): container finished" podID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" exitCode=0 Feb 20 09:23:22 crc kubenswrapper[5094]: I0220 09:23:22.696094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7"} Feb 20 09:23:24 crc kubenswrapper[5094]: I0220 09:23:24.720999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerStarted","Data":"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873"} Feb 20 09:23:25 crc kubenswrapper[5094]: I0220 09:23:25.731956 5094 generic.go:334] "Generic (PLEG): container finished" podID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" 
containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" exitCode=0 Feb 20 09:23:25 crc kubenswrapper[5094]: I0220 09:23:25.732062 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873"} Feb 20 09:23:26 crc kubenswrapper[5094]: I0220 09:23:26.743285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerStarted","Data":"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b"} Feb 20 09:23:26 crc kubenswrapper[5094]: I0220 09:23:26.763810 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-scdqt" podStartSLOduration=3.357818182 podStartE2EDuration="6.763792604s" podCreationTimestamp="2026-02-20 09:23:20 +0000 UTC" firstStartedPulling="2026-02-20 09:23:22.698142591 +0000 UTC m=+9417.570769302" lastFinishedPulling="2026-02-20 09:23:26.104117013 +0000 UTC m=+9420.976743724" observedRunningTime="2026-02-20 09:23:26.76114785 +0000 UTC m=+9421.633774651" watchObservedRunningTime="2026-02-20 09:23:26.763792604 +0000 UTC m=+9421.636419315" Feb 20 09:23:30 crc kubenswrapper[5094]: I0220 09:23:30.825742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:30 crc kubenswrapper[5094]: I0220 09:23:30.826158 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:31 crc kubenswrapper[5094]: I0220 09:23:31.331320 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:31 crc kubenswrapper[5094]: I0220 
09:23:31.852171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:31 crc kubenswrapper[5094]: I0220 09:23:31.902449 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:33 crc kubenswrapper[5094]: I0220 09:23:33.803131 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-scdqt" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" containerID="cri-o://c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" gracePeriod=2 Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.264893 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.272488 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.272555 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.272687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tds5f\" (UniqueName: \"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " Feb 20 09:23:34 crc kubenswrapper[5094]: 
I0220 09:23:34.274496 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities" (OuterVolumeSpecName: "utilities") pod "f05d18a6-4c8d-4876-ae31-b44332ff55ca" (UID: "f05d18a6-4c8d-4876-ae31-b44332ff55ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.338624 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f05d18a6-4c8d-4876-ae31-b44332ff55ca" (UID: "f05d18a6-4c8d-4876-ae31-b44332ff55ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.361979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f" (OuterVolumeSpecName: "kube-api-access-tds5f") pod "f05d18a6-4c8d-4876-ae31-b44332ff55ca" (UID: "f05d18a6-4c8d-4876-ae31-b44332ff55ca"). InnerVolumeSpecName "kube-api-access-tds5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.375559 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.375596 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.375607 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tds5f\" (UniqueName: \"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") on node \"crc\" DevicePath \"\"" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.815614 5094 generic.go:334] "Generic (PLEG): container finished" podID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" exitCode=0 Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.815770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b"} Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.815857 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.816029 5094 scope.go:117] "RemoveContainer" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.816010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"4b359f1a3fa7bb1a2550a1bda8b29ed69dfc413e256d39e62079e81f061afc15"} Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.850097 5094 scope.go:117] "RemoveContainer" containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.871828 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.872248 5094 scope.go:117] "RemoveContainer" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.881101 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.920576 5094 scope.go:117] "RemoveContainer" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" Feb 20 09:23:34 crc kubenswrapper[5094]: E0220 09:23:34.921078 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b\": container with ID starting with c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b not found: ID does not exist" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921119 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b"} err="failed to get container status \"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b\": rpc error: code = NotFound desc = could not find container \"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b\": container with ID starting with c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b not found: ID does not exist" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921145 5094 scope.go:117] "RemoveContainer" containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" Feb 20 09:23:34 crc kubenswrapper[5094]: E0220 09:23:34.921558 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873\": container with ID starting with cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873 not found: ID does not exist" containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921586 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873"} err="failed to get container status \"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873\": rpc error: code = NotFound desc = could not find container \"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873\": container with ID starting with cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873 not found: ID does not exist" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921603 5094 scope.go:117] "RemoveContainer" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" Feb 20 09:23:34 crc kubenswrapper[5094]: E0220 
09:23:34.921925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7\": container with ID starting with 4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7 not found: ID does not exist" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921954 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7"} err="failed to get container status \"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7\": rpc error: code = NotFound desc = could not find container \"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7\": container with ID starting with 4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7 not found: ID does not exist" Feb 20 09:23:35 crc kubenswrapper[5094]: I0220 09:23:35.857848 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" path="/var/lib/kubelet/pods/f05d18a6-4c8d-4876-ae31-b44332ff55ca/volumes" Feb 20 09:24:13 crc kubenswrapper[5094]: I0220 09:24:13.208651 5094 generic.go:334] "Generic (PLEG): container finished" podID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerID="c623c6bcb73425cb5942caee11d7bb36d9dab38dcb8e5a059b4e2fee9f335a16" exitCode=0 Feb 20 09:24:13 crc kubenswrapper[5094]: I0220 09:24:13.208738 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerDied","Data":"c623c6bcb73425cb5942caee11d7bb36d9dab38dcb8e5a059b4e2fee9f335a16"} Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.745880 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864025 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864168 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864197 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864246 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864302 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864390 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864423 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864465 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864517 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864622 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.871208 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv" (OuterVolumeSpecName: "kube-api-access-pjltv") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "kube-api-access-pjltv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.877226 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.877355 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph" (OuterVolumeSpecName: "ceph") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.894403 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.898888 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.899311 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.899345 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.903092 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory" (OuterVolumeSpecName: "inventory") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.918053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.921014 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.921046 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.933283 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.944664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967579 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967611 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967624 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967635 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967643 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967654 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967663 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967671 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967679 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967688 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967696 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967718 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967728 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.237114 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" 
event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerDied","Data":"b5f00a6875a9dca730468d3130ccbd421639490aee72bce64c7d96373c50cba6"} Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.237160 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f00a6875a9dca730468d3130ccbd421639490aee72bce64c7d96373c50cba6" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.237180 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.420993 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-s5mln"] Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 09:24:15.421532 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-utilities" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421557 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-utilities" Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 09:24:15.421576 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421586 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 09:24:15.421603 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-content" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421611 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-content" Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 
09:24:15.421621 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerName="nova-cell1-openstack-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421631 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerName="nova-cell1-openstack-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421908 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerName="nova-cell1-openstack-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421932 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.422861 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.424992 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425450 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425469 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425474 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425614 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.435658 5094 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/telemetry-openstack-openstack-cell1-s5mln"] Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579089 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579159 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579178 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579293 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579313 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579510 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod 
\"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681836 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681886 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681927 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681958 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681977 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.682011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.682037 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.144145 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.144669 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 
09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.144779 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.145175 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.145212 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.145350 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.146523 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod 
\"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.146602 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.343038 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.940439 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-s5mln"] Feb 20 09:24:17 crc kubenswrapper[5094]: I0220 09:24:17.273218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerStarted","Data":"6c7d4214b5a698225ea8951f58de0f8b3b887bc05034f126dfe8b75d2e437a91"} Feb 20 09:24:19 crc kubenswrapper[5094]: I0220 09:24:19.298123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerStarted","Data":"73d8bcbbcd28ef40d7b3c365d7fd939eb7c6d42e40f91b1591867adf8ce6addb"} Feb 20 09:25:34 crc kubenswrapper[5094]: I0220 09:25:34.107511 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:25:34 crc kubenswrapper[5094]: 
I0220 09:25:34.108278 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:26:04 crc kubenswrapper[5094]: I0220 09:26:04.106611 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:26:04 crc kubenswrapper[5094]: I0220 09:26:04.107244 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.106623 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.107473 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.107543 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.108698 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.108836 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f" gracePeriod=600 Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.771789 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f" exitCode=0 Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.772102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f"} Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.772138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"} Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.772159 5094 scope.go:117] "RemoveContainer" 
containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.807364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" podStartSLOduration=139.413504418 podStartE2EDuration="2m19.807343383s" podCreationTimestamp="2026-02-20 09:24:15 +0000 UTC" firstStartedPulling="2026-02-20 09:24:16.948214835 +0000 UTC m=+9471.820841536" lastFinishedPulling="2026-02-20 09:24:17.34205379 +0000 UTC m=+9472.214680501" observedRunningTime="2026-02-20 09:24:19.325233692 +0000 UTC m=+9474.197860473" watchObservedRunningTime="2026-02-20 09:26:34.807343383 +0000 UTC m=+9609.679970104" Feb 20 09:28:34 crc kubenswrapper[5094]: I0220 09:28:34.116616 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:28:34 crc kubenswrapper[5094]: I0220 09:28:34.117349 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:29:04 crc kubenswrapper[5094]: I0220 09:29:04.107194 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:29:04 crc kubenswrapper[5094]: I0220 09:29:04.107984 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.106905 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.107484 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.107538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.108577 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.108647 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" gracePeriod=600 Feb 20 09:29:34 crc kubenswrapper[5094]: E0220 09:29:34.251376 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.794081 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" exitCode=0 Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.794115 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"} Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.794189 5094 scope.go:117] "RemoveContainer" containerID="94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.795043 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:29:34 crc kubenswrapper[5094]: E0220 09:29:34.795481 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:29:49 crc kubenswrapper[5094]: I0220 09:29:49.841081 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:29:49 crc kubenswrapper[5094]: E0220 09:29:49.842129 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.178317 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.180998 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.183416 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.184414 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.192586 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.307452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.307603 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.307659 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.409824 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.409986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.410029 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.411185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.417572 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.428948 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.513752 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.968747 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 09:30:01 crc kubenswrapper[5094]: I0220 09:30:01.105118 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" event={"ID":"06092367-1969-4b35-8025-09e5a52a5855","Type":"ContainerStarted","Data":"bb2abcf7fb1444dc481ad19c1b42c93bf38073d87f41fb8c968098ad31011d6d"} Feb 20 09:30:01 crc kubenswrapper[5094]: I0220 09:30:01.109067 5094 generic.go:334] "Generic (PLEG): container finished" podID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerID="73d8bcbbcd28ef40d7b3c365d7fd939eb7c6d42e40f91b1591867adf8ce6addb" exitCode=0 Feb 20 09:30:01 crc kubenswrapper[5094]: I0220 09:30:01.109151 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" 
event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerDied","Data":"73d8bcbbcd28ef40d7b3c365d7fd939eb7c6d42e40f91b1591867adf8ce6addb"} Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.121012 5094 generic.go:334] "Generic (PLEG): container finished" podID="06092367-1969-4b35-8025-09e5a52a5855" containerID="4458d3e89efbd0e5ea42a99c4b47f135cba67a66cbdaaf49efb55576b8dd1322" exitCode=0 Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.121067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" event={"ID":"06092367-1969-4b35-8025-09e5a52a5855","Type":"ContainerDied","Data":"4458d3e89efbd0e5ea42a99c4b47f135cba67a66cbdaaf49efb55576b8dd1322"} Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.748913 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.751614 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.768828 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.794673 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.840843 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:02 crc kubenswrapper[5094]: E0220 09:30:02.841615 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885787 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885911 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885957 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885984 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886083 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886116 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886143 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886191 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.887563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") 
pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.887638 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.888058 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.906759 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph" (OuterVolumeSpecName: "ceph") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.909054 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.922110 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq" (OuterVolumeSpecName: "kube-api-access-lnfsq") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "kube-api-access-lnfsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.925143 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.925933 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.925983 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory" (OuterVolumeSpecName: "inventory") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.945816 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.948389 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.989676 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.989921 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.989969 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990100 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990126 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990139 5094 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990153 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990163 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990176 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990189 5094 reconciler_common.go:293] "Volume 
detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990202 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990667 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.991114 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.009337 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.109038 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.133147 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerDied","Data":"6c7d4214b5a698225ea8951f58de0f8b3b887bc05034f126dfe8b75d2e437a91"} Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.133235 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7d4214b5a698225ea8951f58de0f8b3b887bc05034f126dfe8b75d2e437a91" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.133171 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.248940 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6sqzq"] Feb 20 09:30:03 crc kubenswrapper[5094]: E0220 09:30:03.250336 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerName="telemetry-openstack-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.250494 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerName="telemetry-openstack-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.251093 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerName="telemetry-openstack-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.252152 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.256326 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.257724 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.257973 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.260175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.260391 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.274615 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6sqzq"] Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.298886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.298976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: 
\"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.298998 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.299021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.299084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.299105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.405901 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.405946 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406033 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406131 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: 
\"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406150 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.411818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.412230 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.413097 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.413277 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.416465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.429495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.569284 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.600361 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.710801 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"06092367-1969-4b35-8025-09e5a52a5855\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.710898 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"06092367-1969-4b35-8025-09e5a52a5855\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.711175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"06092367-1969-4b35-8025-09e5a52a5855\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.712011 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume" (OuterVolumeSpecName: "config-volume") pod "06092367-1969-4b35-8025-09e5a52a5855" (UID: "06092367-1969-4b35-8025-09e5a52a5855"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.715189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q" (OuterVolumeSpecName: "kube-api-access-dhh9q") pod "06092367-1969-4b35-8025-09e5a52a5855" (UID: "06092367-1969-4b35-8025-09e5a52a5855"). 
InnerVolumeSpecName "kube-api-access-dhh9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.715197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06092367-1969-4b35-8025-09e5a52a5855" (UID: "06092367-1969-4b35-8025-09e5a52a5855"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.789541 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:03 crc kubenswrapper[5094]: W0220 09:30:03.800887 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f58baa6_934a_458f_be52_13c9185c7076.slice/crio-4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925 WatchSource:0}: Error finding container 4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925: Status 404 returned error can't find the container with id 4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925 Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.814527 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.814563 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.814575 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.145928 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6sqzq"] Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157124 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157263 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f58baa6-934a-458f-be52-13c9185c7076" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" exitCode=0 Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157490 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a"} Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerStarted","Data":"4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925"} Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.161509 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" event={"ID":"06092367-1969-4b35-8025-09e5a52a5855","Type":"ContainerDied","Data":"bb2abcf7fb1444dc481ad19c1b42c93bf38073d87f41fb8c968098ad31011d6d"} Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.161639 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb2abcf7fb1444dc481ad19c1b42c93bf38073d87f41fb8c968098ad31011d6d" Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.161795 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.676129 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.691643 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.184310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerStarted","Data":"17fbee9c4a3852e927701f4dfea9296493a9a027cb979ab0b3ffefe9fd18a14f"} Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.184746 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerStarted","Data":"807e3a8640eb579cfbd8f2e00fbe291277d0bdb2b420fc4b5394ff2cab037a0a"} Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.217237 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" podStartSLOduration=1.737112878 podStartE2EDuration="2.217216429s" podCreationTimestamp="2026-02-20 09:30:03 +0000 UTC" firstStartedPulling="2026-02-20 09:30:04.156676195 +0000 UTC m=+9819.029302916" lastFinishedPulling="2026-02-20 09:30:04.636779736 +0000 UTC m=+9819.509406467" observedRunningTime="2026-02-20 09:30:05.206232655 +0000 UTC m=+9820.078859366" watchObservedRunningTime="2026-02-20 09:30:05.217216429 +0000 UTC m=+9820.089843140" Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.859976 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" path="/var/lib/kubelet/pods/4271712d-7fb9-4862-bc38-e3cfbcced425/volumes" Feb 20 09:30:06 crc kubenswrapper[5094]: I0220 09:30:06.198225 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f58baa6-934a-458f-be52-13c9185c7076" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" exitCode=0 Feb 20 09:30:06 crc kubenswrapper[5094]: I0220 09:30:06.198326 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b"} Feb 20 09:30:07 crc kubenswrapper[5094]: I0220 09:30:07.208858 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerStarted","Data":"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c"} Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.110171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.110729 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.154779 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.181150 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9tfpc" podStartSLOduration=8.733881591 podStartE2EDuration="11.181106886s" podCreationTimestamp="2026-02-20 09:30:02 +0000 UTC" firstStartedPulling="2026-02-20 09:30:04.159420501 +0000 UTC m=+9819.032047222" 
lastFinishedPulling="2026-02-20 09:30:06.606645806 +0000 UTC m=+9821.479272517" observedRunningTime="2026-02-20 09:30:07.262978266 +0000 UTC m=+9822.135604977" watchObservedRunningTime="2026-02-20 09:30:13.181106886 +0000 UTC m=+9828.053733597" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.403896 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.466100 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:15 crc kubenswrapper[5094]: I0220 09:30:15.373008 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9tfpc" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" containerID="cri-o://ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" gracePeriod=2 Feb 20 09:30:15 crc kubenswrapper[5094]: I0220 09:30:15.848686 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:15 crc kubenswrapper[5094]: E0220 09:30:15.849346 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:15 crc kubenswrapper[5094]: I0220 09:30:15.893289 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.029796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") pod \"6f58baa6-934a-458f-be52-13c9185c7076\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.029880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"6f58baa6-934a-458f-be52-13c9185c7076\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.029915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"6f58baa6-934a-458f-be52-13c9185c7076\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.030841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities" (OuterVolumeSpecName: "utilities") pod "6f58baa6-934a-458f-be52-13c9185c7076" (UID: "6f58baa6-934a-458f-be52-13c9185c7076"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.041445 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq" (OuterVolumeSpecName: "kube-api-access-swqtq") pod "6f58baa6-934a-458f-be52-13c9185c7076" (UID: "6f58baa6-934a-458f-be52-13c9185c7076"). InnerVolumeSpecName "kube-api-access-swqtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.095510 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f58baa6-934a-458f-be52-13c9185c7076" (UID: "6f58baa6-934a-458f-be52-13c9185c7076"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.132791 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.132828 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.133026 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386434 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f58baa6-934a-458f-be52-13c9185c7076" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" exitCode=0 Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c"} Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386491 5094 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386508 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925"} Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386526 5094 scope.go:117] "RemoveContainer" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.405869 5094 scope.go:117] "RemoveContainer" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.426971 5094 scope.go:117] "RemoveContainer" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.453013 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.471502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.483031 5094 scope.go:117] "RemoveContainer" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" Feb 20 09:30:16 crc kubenswrapper[5094]: E0220 09:30:16.483666 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c\": container with ID starting with ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c not found: ID does not exist" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.483711 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c"} err="failed to get container status \"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c\": rpc error: code = NotFound desc = could not find container \"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c\": container with ID starting with ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c not found: ID does not exist" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.483757 5094 scope.go:117] "RemoveContainer" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" Feb 20 09:30:16 crc kubenswrapper[5094]: E0220 09:30:16.484150 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b\": container with ID starting with bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b not found: ID does not exist" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.484211 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b"} err="failed to get container status \"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b\": rpc error: code = NotFound desc = could not find container \"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b\": container with ID starting with bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b not found: ID does not exist" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.484241 5094 scope.go:117] "RemoveContainer" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" Feb 20 09:30:16 crc kubenswrapper[5094]: E0220 
09:30:16.484540 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a\": container with ID starting with 609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a not found: ID does not exist" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.484572 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a"} err="failed to get container status \"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a\": rpc error: code = NotFound desc = could not find container \"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a\": container with ID starting with 609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a not found: ID does not exist" Feb 20 09:30:17 crc kubenswrapper[5094]: I0220 09:30:17.851032 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f58baa6-934a-458f-be52-13c9185c7076" path="/var/lib/kubelet/pods/6f58baa6-934a-458f-be52-13c9185c7076/volumes" Feb 20 09:30:30 crc kubenswrapper[5094]: I0220 09:30:30.840122 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:30 crc kubenswrapper[5094]: E0220 09:30:30.840738 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:43 crc kubenswrapper[5094]: I0220 09:30:43.841833 
5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:43 crc kubenswrapper[5094]: E0220 09:30:43.842667 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:44 crc kubenswrapper[5094]: I0220 09:30:44.538395 5094 scope.go:117] "RemoveContainer" containerID="8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48" Feb 20 09:30:52 crc kubenswrapper[5094]: I0220 09:30:52.740318 5094 generic.go:334] "Generic (PLEG): container finished" podID="6d3bf727-1eae-408c-be3d-2df97b387704" containerID="17fbee9c4a3852e927701f4dfea9296493a9a027cb979ab0b3ffefe9fd18a14f" exitCode=0 Feb 20 09:30:52 crc kubenswrapper[5094]: I0220 09:30:52.740392 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerDied","Data":"17fbee9c4a3852e927701f4dfea9296493a9a027cb979ab0b3ffefe9fd18a14f"} Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.363433 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.444777 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445179 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445408 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445578 
5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.542834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2" (OuterVolumeSpecName: "kube-api-access-9jpl2") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "kube-api-access-9jpl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.542927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.543181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph" (OuterVolumeSpecName: "ceph") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.545804 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory" (OuterVolumeSpecName: "inventory") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.547362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.547726 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: W0220 09:30:54.547765 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6d3bf727-1eae-408c-be3d-2df97b387704/volumes/kubernetes.io~secret/ssh-key-openstack-cell1 Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.547776 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548310 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548337 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548352 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548365 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548377 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.551816 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.650445 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.770958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerDied","Data":"807e3a8640eb579cfbd8f2e00fbe291277d0bdb2b420fc4b5394ff2cab037a0a"} Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.770997 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="807e3a8640eb579cfbd8f2e00fbe291277d0bdb2b420fc4b5394ff2cab037a0a" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.771030 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887177 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9"] Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887795 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-utilities" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887810 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-utilities" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887830 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-content" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887839 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-content" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887858 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3bf727-1eae-408c-be3d-2df97b387704" containerName="neutron-sriov-openstack-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887868 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3bf727-1eae-408c-be3d-2df97b387704" containerName="neutron-sriov-openstack-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887886 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887894 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887913 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="06092367-1969-4b35-8025-09e5a52a5855" containerName="collect-profiles" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887922 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="06092367-1969-4b35-8025-09e5a52a5855" containerName="collect-profiles" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.888182 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.888201 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3bf727-1eae-408c-be3d-2df97b387704" containerName="neutron-sriov-openstack-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.888219 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="06092367-1969-4b35-8025-09e5a52a5855" containerName="collect-profiles" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.889116 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.893234 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.893471 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.893612 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.894580 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.895044 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.899508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9"] Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.955895 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956046 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956083 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.058902 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059024 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059061 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059120 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059190 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059235 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065276 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065312 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065308 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065816 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.067440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.079741 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.247689 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.795011 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9"] Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.853330 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:55 crc kubenswrapper[5094]: E0220 09:30:55.853649 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:56 crc kubenswrapper[5094]: I0220 09:30:56.792976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerStarted","Data":"16bb3d0c2b07a856a2200802f3afc3601693f102f136e4654b9e478e3c583b2c"} Feb 20 09:30:56 crc kubenswrapper[5094]: I0220 09:30:56.793321 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerStarted","Data":"e39920cb7c5a699ca2b8e232842e2e36ead718dc4047ce9c0e15c3362b15e939"} Feb 20 09:30:56 crc kubenswrapper[5094]: I0220 09:30:56.826898 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" podStartSLOduration=2.428919335 podStartE2EDuration="2.826878229s" podCreationTimestamp="2026-02-20 09:30:54 +0000 UTC" firstStartedPulling="2026-02-20 09:30:55.791246794 +0000 UTC m=+9870.663873505" 
lastFinishedPulling="2026-02-20 09:30:56.189205688 +0000 UTC m=+9871.061832399" observedRunningTime="2026-02-20 09:30:56.816350146 +0000 UTC m=+9871.688976857" watchObservedRunningTime="2026-02-20 09:30:56.826878229 +0000 UTC m=+9871.699504940" Feb 20 09:31:07 crc kubenswrapper[5094]: I0220 09:31:07.840515 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:07 crc kubenswrapper[5094]: E0220 09:31:07.841927 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:22 crc kubenswrapper[5094]: I0220 09:31:22.840628 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:22 crc kubenswrapper[5094]: E0220 09:31:22.843440 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:35 crc kubenswrapper[5094]: I0220 09:31:35.851431 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:35 crc kubenswrapper[5094]: E0220 09:31:35.852326 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:46 crc kubenswrapper[5094]: I0220 09:31:46.840161 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:46 crc kubenswrapper[5094]: E0220 09:31:46.840900 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:58 crc kubenswrapper[5094]: I0220 09:31:58.416589 5094 generic.go:334] "Generic (PLEG): container finished" podID="f58790dc-4468-40ad-ba58-bb433a926abe" containerID="16bb3d0c2b07a856a2200802f3afc3601693f102f136e4654b9e478e3c583b2c" exitCode=0 Feb 20 09:31:58 crc kubenswrapper[5094]: I0220 09:31:58.416775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerDied","Data":"16bb3d0c2b07a856a2200802f3afc3601693f102f136e4654b9e478e3c583b2c"} Feb 20 09:31:59 crc kubenswrapper[5094]: I0220 09:31:59.840479 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:59 crc kubenswrapper[5094]: E0220 09:31:59.841102 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.028052 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.127830 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.127959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.127999 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.128115 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.128226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.128242 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.135899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph" (OuterVolumeSpecName: "ceph") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.136050 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f" (OuterVolumeSpecName: "kube-api-access-dbb4f") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "kube-api-access-dbb4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.136229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.158401 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.172120 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.176485 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory" (OuterVolumeSpecName: "inventory") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230527 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230555 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230565 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230576 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230586 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230594 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.439719 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" 
event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerDied","Data":"e39920cb7c5a699ca2b8e232842e2e36ead718dc4047ce9c0e15c3362b15e939"} Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.439761 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e39920cb7c5a699ca2b8e232842e2e36ead718dc4047ce9c0e15c3362b15e939" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.439772 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:32:14 crc kubenswrapper[5094]: I0220 09:32:14.840259 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:32:14 crc kubenswrapper[5094]: E0220 09:32:14.841079 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:24 crc kubenswrapper[5094]: I0220 09:32:24.716811 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 09:32:24 crc kubenswrapper[5094]: I0220 09:32:24.717548 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor" containerID="cri-o://fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.176907 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.177326 
5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor" containerID="cri-o://b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.369084 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.369295 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" containerID="cri-o://10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.394848 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.395109 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log" containerID="cri-o://39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.395230 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api" containerID="cri-o://c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.453088 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.453316 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log" containerID="cri-o://c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.453795 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata" containerID="cri-o://25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.727084 5094 generic.go:334] "Generic (PLEG): container finished" podID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerID="c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320" exitCode=143 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.727168 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerDied","Data":"c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320"} Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.730049 5094 generic.go:334] "Generic (PLEG): container finished" podID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerID="39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45" exitCode=143 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.730090 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerDied","Data":"39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45"} Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.371683 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.529033 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") "
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.529170 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") "
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.529231 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") "
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.534141 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8" (OuterVolumeSpecName: "kube-api-access-2d2s8") pod "ec04fa38-0d41-4c78-99fd-56299cd1c5ac" (UID: "ec04fa38-0d41-4c78-99fd-56299cd1c5ac"). InnerVolumeSpecName "kube-api-access-2d2s8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.562910 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data" (OuterVolumeSpecName: "config-data") pod "ec04fa38-0d41-4c78-99fd-56299cd1c5ac" (UID: "ec04fa38-0d41-4c78-99fd-56299cd1c5ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.565490 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec04fa38-0d41-4c78-99fd-56299cd1c5ac" (UID: "ec04fa38-0d41-4c78-99fd-56299cd1c5ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.631343 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.631540 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.631599 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740270 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8" exitCode=0
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerDied","Data":"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"}
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740334 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerDied","Data":"205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899"}
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740349 5094 scope.go:117] "RemoveContainer" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740452 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.773863 5094 scope.go:117] "RemoveContainer" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"
Feb 20 09:32:26 crc kubenswrapper[5094]: E0220 09:32:26.774365 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8\": container with ID starting with b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8 not found: ID does not exist" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.774417 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"} err="failed to get container status \"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8\": rpc error: code = NotFound desc = could not find container \"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8\": container with ID starting with b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8 not found: ID does not exist"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.780944 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.802134 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.818574 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: E0220 09:32:26.819160 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58790dc-4468-40ad-ba58-bb433a926abe" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819177 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58790dc-4468-40ad-ba58-bb433a926abe" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 20 09:32:26 crc kubenswrapper[5094]: E0220 09:32:26.819241 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819253 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819491 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58790dc-4468-40ad-ba58-bb433a926abe" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819531 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.820421 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.827097 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.828752 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.835667 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.836070 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.836104 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5m6v\" (UniqueName: \"kubernetes.io/projected/23153570-19e2-4a29-9533-5db90a0c5d09-kube-api-access-v5m6v\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.937539 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.937718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.937754 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5m6v\" (UniqueName: \"kubernetes.io/projected/23153570-19e2-4a29-9533-5db90a0c5d09-kube-api-access-v5m6v\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.949783 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.958867 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.972969 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5m6v\" (UniqueName: \"kubernetes.io/projected/23153570-19e2-4a29-9533-5db90a0c5d09-kube-api-access-v5m6v\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.148097 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.646391 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.751599 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23153570-19e2-4a29-9533-5db90a0c5d09","Type":"ContainerStarted","Data":"c3f4d685a93958bf9224bacc59e5f1b58749b345e66218d6839bb67feec0373c"}
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.855753 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" path="/var/lib/kubelet/pods/ec04fa38-0d41-4c78-99fd-56299cd1c5ac/volumes"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.787223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23153570-19e2-4a29-9533-5db90a0c5d09","Type":"ContainerStarted","Data":"82040ae5321ea360ff64c5dde93b028401d429a83a61f8db08baa6fd1a7397ac"}
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.787733 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.790237 5094 generic.go:334] "Generic (PLEG): container finished" podID="02305b70-64d3-46af-876a-f81d73f83cbf" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" exitCode=0
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.790320 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerDied","Data":"fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b"}
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.819278 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.81925231 podStartE2EDuration="2.81925231s" podCreationTimestamp="2026-02-20 09:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:28.801918903 +0000 UTC m=+9963.674545624" watchObservedRunningTime="2026-02-20 09:32:28.81925231 +0000 UTC m=+9963.691879031"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.873142 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": read tcp 10.217.0.2:34240->10.217.1.87:8775: read: connection reset by peer"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.873117 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": read tcp 10.217.0.2:34248->10.217.1.87:8775: read: connection reset by peer"
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.000024 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.000559 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.001005 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.001038 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.730181 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.807241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerDied","Data":"b27e92493a93d647968058e4cfe443d6348c867ece948ce72e75c01521bbc434"}
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.807573 5094 scope.go:117] "RemoveContainer" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.807732 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.814433 5094 generic.go:334] "Generic (PLEG): container finished" podID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerID="25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381" exitCode=0
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.814498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerDied","Data":"25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381"}
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.816573 5094 generic.go:334] "Generic (PLEG): container finished" podID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerID="c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509" exitCode=0
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.817086 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerDied","Data":"c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509"}
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.843027 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.846191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.903314 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod \"02305b70-64d3-46af-876a-f81d73f83cbf\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") "
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.903400 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"02305b70-64d3-46af-876a-f81d73f83cbf\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") "
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.903478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"02305b70-64d3-46af-876a-f81d73f83cbf\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") "
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.918413 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747" (OuterVolumeSpecName: "kube-api-access-wk747") pod "02305b70-64d3-46af-876a-f81d73f83cbf" (UID: "02305b70-64d3-46af-876a-f81d73f83cbf"). InnerVolumeSpecName "kube-api-access-wk747". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.006267 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.043551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data" (OuterVolumeSpecName: "config-data") pod "02305b70-64d3-46af-876a-f81d73f83cbf" (UID: "02305b70-64d3-46af-876a-f81d73f83cbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.054950 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02305b70-64d3-46af-876a-f81d73f83cbf" (UID: "02305b70-64d3-46af-876a-f81d73f83cbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.110087 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.110123 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.150497 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.151249 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.155028 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.155070 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.203264 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213451 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213680 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.214189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs" (OuterVolumeSpecName: "logs") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.214786 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.218739 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.223528 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.228262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk" (OuterVolumeSpecName: "kube-api-access-dcbxk") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "kube-api-access-dcbxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.241120 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.262463 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.262963 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.262976 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.262988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.262994 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.263014 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263020 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.263047 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263053 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.263070 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263081 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263290 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263305 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263320 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263333 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263350 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.264060 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.270225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.284863 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.293500 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.307877 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data" (OuterVolumeSpecName: "config-data") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317427 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317601 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317978 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktbp\" (UniqueName: \"kubernetes.io/projected/9ae582b9-3951-4670-91bf-5d044269ff1c-kube-api-access-mktbp\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.318086 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.318192 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.320298 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv" (OuterVolumeSpecName: "kube-api-access-ltthv") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "kube-api-access-ltthv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.321115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs" (OuterVolumeSpecName: "logs") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.318361 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.322763 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.322775 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.354342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data" (OuterVolumeSpecName: "config-data") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.354366 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.384910 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423604 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423712 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423795 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423974 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktbp\" (UniqueName: \"kubernetes.io/projected/9ae582b9-3951-4670-91bf-5d044269ff1c-kube-api-access-mktbp\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424033 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424186 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424196 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424206 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424215 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.430372 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.433999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr" (OuterVolumeSpecName: "kube-api-access-t7kqr") pod
"80d2f807-a13f-4a1d-93d3-293d1afd6e4c" (UID: "80d2f807-a13f-4a1d-93d3-293d1afd6e4c"). InnerVolumeSpecName "kube-api-access-t7kqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.435765 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.439974 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktbp\" (UniqueName: \"kubernetes.io/projected/9ae582b9-3951-4670-91bf-5d044269ff1c-kube-api-access-mktbp\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.452270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80d2f807-a13f-4a1d-93d3-293d1afd6e4c" (UID: "80d2f807-a13f-4a1d-93d3-293d1afd6e4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.456844 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data" (OuterVolumeSpecName: "config-data") pod "80d2f807-a13f-4a1d-93d3-293d1afd6e4c" (UID: "80d2f807-a13f-4a1d-93d3-293d1afd6e4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.526895 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.526953 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.526966 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.597224 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.835246 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerDied","Data":"d585671e2fde2c389818c568ec8f701d1f0c341b00acbfaa339458d079916a62"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.835310 5094 scope.go:117] "RemoveContainer" containerID="25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.835415 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845531 5094 generic.go:334] "Generic (PLEG): container finished" podID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" exitCode=0 Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845631 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerDied","Data":"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845687 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerDied","Data":"364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845788 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.853332 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerDied","Data":"867e153f6129e6c09f4b4a68b08d0ac6938b5f39543d1e08a62fd3fdae93737c"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.853420 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.078511 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.403613 5094 scope.go:117] "RemoveContainer" containerID="c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.433259 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.444210 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.457858 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.482753 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.506797 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: E0220 09:32:31.507289 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.507310 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.507580 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.508365 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.514782 5094 scope.go:117] "RemoveContainer" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.518640 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.529097 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.584258 5094 scope.go:117] "RemoveContainer" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" Feb 20 09:32:31 crc kubenswrapper[5094]: E0220 09:32:31.586371 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f\": container with ID starting with 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f not found: ID does not exist" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.586421 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f"} err="failed to get container status \"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f\": rpc error: code = NotFound desc = could not find container \"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f\": container with ID starting with 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f not found: ID does not exist" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.586466 5094 scope.go:117] "RemoveContainer" containerID="c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509" Feb 20 09:32:31 crc 
kubenswrapper[5094]: I0220 09:32:31.602374 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.617171 5094 scope.go:117] "RemoveContainer" containerID="39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.621105 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.631758 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.633729 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.637356 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.642957 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.656020 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.657938 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.659890 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-config-data\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.659954 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.659992 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2cw\" (UniqueName: \"kubernetes.io/projected/67a3bd12-be26-46a3-bd66-982bea39049a-kube-api-access-lv2cw\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.661997 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.665898 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761382 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761433 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx8fk\" (UniqueName: \"kubernetes.io/projected/b6886613-4f07-498a-911f-4d77704ab4df-kube-api-access-sx8fk\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761470 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2cw\" (UniqueName: \"kubernetes.io/projected/67a3bd12-be26-46a3-bd66-982bea39049a-kube-api-access-lv2cw\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761523 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-config-data\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761551 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761583 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-config-data\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761655 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-config-data\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761678 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6886613-4f07-498a-911f-4d77704ab4df-logs\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761693 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f99e6f-46a8-4a46-bcae-81947aa95700-logs\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761732 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761753 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxfv6\" (UniqueName: \"kubernetes.io/projected/87f99e6f-46a8-4a46-bcae-81947aa95700-kube-api-access-pxfv6\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.769653 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-config-data\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " 
pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.770278 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.778462 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2cw\" (UniqueName: \"kubernetes.io/projected/67a3bd12-be26-46a3-bd66-982bea39049a-kube-api-access-lv2cw\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.837056 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.856989 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" path="/var/lib/kubelet/pods/02305b70-64d3-46af-876a-f81d73f83cbf/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.857643 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" path="/var/lib/kubelet/pods/1323ed20-0605-4081-a36d-6fa8c40f26e6/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.858918 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" path="/var/lib/kubelet/pods/4c1b5836-3f97-4ae2-a894-e42a72b29729/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.860022 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" path="/var/lib/kubelet/pods/80d2f807-a13f-4a1d-93d3-293d1afd6e4c/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 
09:32:31.863063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-config-data\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863122 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863162 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-config-data\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863237 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6886613-4f07-498a-911f-4d77704ab4df-logs\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f99e6f-46a8-4a46-bcae-81947aa95700-logs\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863283 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863900 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6886613-4f07-498a-911f-4d77704ab4df-logs\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.864088 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f99e6f-46a8-4a46-bcae-81947aa95700-logs\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.864168 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfv6\" (UniqueName: \"kubernetes.io/projected/87f99e6f-46a8-4a46-bcae-81947aa95700-kube-api-access-pxfv6\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.864193 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx8fk\" (UniqueName: \"kubernetes.io/projected/b6886613-4f07-498a-911f-4d77704ab4df-kube-api-access-sx8fk\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.867873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.867934 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-config-data\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.867997 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.873520 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-config-data\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.881362 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ae582b9-3951-4670-91bf-5d044269ff1c","Type":"ContainerStarted","Data":"dde8a975f326a4ab0f78730b58e14cf202d320584e2e32873e4c8b3c0cbe1f90"} Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.881408 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ae582b9-3951-4670-91bf-5d044269ff1c","Type":"ContainerStarted","Data":"379bbcf599c0781990628e9389811419bb8aab99ec5f6dfc156931ce84ca14e9"} Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.882659 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.882691 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxfv6\" (UniqueName: \"kubernetes.io/projected/87f99e6f-46a8-4a46-bcae-81947aa95700-kube-api-access-pxfv6\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " 
pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.903075 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx8fk\" (UniqueName: \"kubernetes.io/projected/b6886613-4f07-498a-911f-4d77704ab4df-kube-api-access-sx8fk\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.922203 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.922173071 podStartE2EDuration="1.922173071s" podCreationTimestamp="2026-02-20 09:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:31.915031939 +0000 UTC m=+9966.787658650" watchObservedRunningTime="2026-02-20 09:32:31.922173071 +0000 UTC m=+9966.794799782" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.954197 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.988972 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.186210 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.309605 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:32 crc kubenswrapper[5094]: W0220 09:32:32.311082 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67a3bd12_be26_46a3_bd66_982bea39049a.slice/crio-5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507 WatchSource:0}: Error finding container 5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507: Status 404 returned error can't find the container with id 5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507 Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.431853 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.545418 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:32 crc kubenswrapper[5094]: W0220 09:32:32.553093 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6886613_4f07_498a_911f_4d77704ab4df.slice/crio-5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145 WatchSource:0}: Error finding container 5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145: Status 404 returned error can't find the container with id 5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145 Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.896655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b6886613-4f07-498a-911f-4d77704ab4df","Type":"ContainerStarted","Data":"995a9223f6e307afd506adc22a61e1152c3ee31123de0c92dc4dac150289ab55"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.896710 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6886613-4f07-498a-911f-4d77704ab4df","Type":"ContainerStarted","Data":"5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.900364 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67a3bd12-be26-46a3-bd66-982bea39049a","Type":"ContainerStarted","Data":"35716fc27f82e5b6c12e47b0adf36caf429732bd9f4df54a7314bb44da06331b"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.900442 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67a3bd12-be26-46a3-bd66-982bea39049a","Type":"ContainerStarted","Data":"5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.903532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87f99e6f-46a8-4a46-bcae-81947aa95700","Type":"ContainerStarted","Data":"9b891c6114fffd97f16c1211284ae3ba50954686c250967b8979eb57c794377d"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.904315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87f99e6f-46a8-4a46-bcae-81947aa95700","Type":"ContainerStarted","Data":"69f5327d4e2e931dbb3e097c3ae567eac64dee32b5e10553719964bfaea7d383"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.904332 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87f99e6f-46a8-4a46-bcae-81947aa95700","Type":"ContainerStarted","Data":"a08cf42a633180684561dacd926e08fcf523aa424706ca2a8b1e45e2e4e16bc8"} Feb 20 09:32:32 crc 
kubenswrapper[5094]: I0220 09:32:32.920635 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.920618781 podStartE2EDuration="1.920618781s" podCreationTimestamp="2026-02-20 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:32.914614157 +0000 UTC m=+9967.787240868" watchObservedRunningTime="2026-02-20 09:32:32.920618781 +0000 UTC m=+9967.793245492" Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.951697 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9516762779999999 podStartE2EDuration="1.951676278s" podCreationTimestamp="2026-02-20 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:32.939125756 +0000 UTC m=+9967.811752467" watchObservedRunningTime="2026-02-20 09:32:32.951676278 +0000 UTC m=+9967.824303009" Feb 20 09:32:33 crc kubenswrapper[5094]: I0220 09:32:33.937308 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6886613-4f07-498a-911f-4d77704ab4df","Type":"ContainerStarted","Data":"84e9062ecb98a3cfd9caef3e7bd887d3c643f0c5dc9d7f9a7a51d5b91793e59f"} Feb 20 09:32:33 crc kubenswrapper[5094]: I0220 09:32:33.977857 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.977834516 podStartE2EDuration="2.977834516s" podCreationTimestamp="2026-02-20 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:33.967473727 +0000 UTC m=+9968.840100458" watchObservedRunningTime="2026-02-20 09:32:33.977834516 +0000 UTC m=+9968.850461257" Feb 20 09:32:36 crc kubenswrapper[5094]: I0220 
09:32:36.838785 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 09:32:36 crc kubenswrapper[5094]: I0220 09:32:36.954391 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 09:32:36 crc kubenswrapper[5094]: I0220 09:32:36.954431 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 09:32:40 crc kubenswrapper[5094]: I0220 09:32:40.656677 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.838677 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.954884 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.955232 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.977852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.989783 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.989834 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 09:32:42 crc kubenswrapper[5094]: I0220 09:32:42.050318 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 09:32:42 crc kubenswrapper[5094]: I0220 09:32:42.840587 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 
09:32:42 crc kubenswrapper[5094]: E0220 09:32:42.841152 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.121826 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b6886613-4f07-498a-911f-4d77704ab4df" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.121864 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="87f99e6f-46a8-4a46-bcae-81947aa95700" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.121813 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="87f99e6f-46a8-4a46-bcae-81947aa95700" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.122201 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b6886613-4f07-498a-911f-4d77704ab4df" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:51 crc kubenswrapper[5094]: 
I0220 09:32:51.426043 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.428762 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.437163 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.529415 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.529469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.529502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.631552 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod 
\"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.631591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.631613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.632136 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.632139 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.650308 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"redhat-operators-km8mn\" (UID: 
\"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.764476 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.957726 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.957995 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.960057 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.962228 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.000383 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.001576 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.012360 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.016513 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.123990 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.133367 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.263682 
5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.133213 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" exitCode=0 Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.133404 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1"} Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.134649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerStarted","Data":"610c159ff287bb67be257ddc8c95e160d06d6ef93bad3fd074b27d4eb4beeeda"} Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.891933 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd"] Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.893807 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.898633 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.898888 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899028 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899253 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899368 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899551 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899724 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.949390 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd"] Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003926 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003954 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: 
\"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004062 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004086 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004110 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004132 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004174 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004238 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.106751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107239 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107276 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107304 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107341 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107385 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107437 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107464 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.110540 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.110552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.114781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.114967 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.115085 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.115728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.117221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.119973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: 
\"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.120564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.128759 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.129745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.131594 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.134752 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.152486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerStarted","Data":"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688"} Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.267556 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.894480 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd"] Feb 20 09:32:54 crc kubenswrapper[5094]: W0220 09:32:54.900087 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b30b185_0b70_4ad8_8eca_a292b76fb410.slice/crio-135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7 WatchSource:0}: Error finding container 135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7: Status 404 returned error can't find the container with id 135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7 Feb 20 09:32:55 crc kubenswrapper[5094]: I0220 09:32:55.165914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" 
event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerStarted","Data":"135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7"} Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.193762 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" exitCode=0 Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.193836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688"} Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.197742 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerStarted","Data":"40b6b847514554d6d2915d390a68f23820ddde7e93d36e6d67b4ad209729e928"} Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.250752 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" podStartSLOduration=2.766982284 podStartE2EDuration="3.250735901s" podCreationTimestamp="2026-02-20 09:32:53 +0000 UTC" firstStartedPulling="2026-02-20 09:32:54.902455024 +0000 UTC m=+9989.775081735" lastFinishedPulling="2026-02-20 09:32:55.386208641 +0000 UTC m=+9990.258835352" observedRunningTime="2026-02-20 09:32:56.246396406 +0000 UTC m=+9991.119023117" watchObservedRunningTime="2026-02-20 09:32:56.250735901 +0000 UTC m=+9991.123362612" Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.841052 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:32:56 crc kubenswrapper[5094]: E0220 09:32:56.841573 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:58 crc kubenswrapper[5094]: I0220 09:32:58.219539 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerStarted","Data":"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61"} Feb 20 09:32:58 crc kubenswrapper[5094]: I0220 09:32:58.249844 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-km8mn" podStartSLOduration=3.419647218 podStartE2EDuration="7.249823154s" podCreationTimestamp="2026-02-20 09:32:51 +0000 UTC" firstStartedPulling="2026-02-20 09:32:53.136659142 +0000 UTC m=+9988.009285853" lastFinishedPulling="2026-02-20 09:32:56.966835088 +0000 UTC m=+9991.839461789" observedRunningTime="2026-02-20 09:32:58.241546866 +0000 UTC m=+9993.114173597" watchObservedRunningTime="2026-02-20 09:32:58.249823154 +0000 UTC m=+9993.122449865" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.801624 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.812877 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.863109 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.955733 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.955899 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.956256 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.058595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.058776 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.058870 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.059385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.059472 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.081363 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.141599 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.645118 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:01 crc kubenswrapper[5094]: I0220 09:33:01.255105 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerStarted","Data":"851258ed832aa014433a1a37cc6b0d08924a30fb97018dfda95e09099f934eaf"} Feb 20 09:33:01 crc kubenswrapper[5094]: I0220 09:33:01.765433 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:01 crc kubenswrapper[5094]: I0220 09:33:01.765488 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:02 crc kubenswrapper[5094]: I0220 09:33:02.265314 5094 generic.go:334] "Generic (PLEG): container finished" podID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerID="6a950e737c27d27ac2b886f1b67fb7e0f1d51bad1dc3e086671beee3ea9e99a6" exitCode=0 Feb 20 09:33:02 crc kubenswrapper[5094]: I0220 09:33:02.265412 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"6a950e737c27d27ac2b886f1b67fb7e0f1d51bad1dc3e086671beee3ea9e99a6"} Feb 20 09:33:02 crc kubenswrapper[5094]: I0220 09:33:02.810030 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-km8mn" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" probeResult="failure" output=< Feb 20 09:33:02 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:33:02 crc kubenswrapper[5094]: > Feb 20 09:33:04 crc kubenswrapper[5094]: I0220 
09:33:04.297882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerStarted","Data":"d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70"} Feb 20 09:33:05 crc kubenswrapper[5094]: I0220 09:33:05.308396 5094 generic.go:334] "Generic (PLEG): container finished" podID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerID="d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70" exitCode=0 Feb 20 09:33:05 crc kubenswrapper[5094]: I0220 09:33:05.308436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70"} Feb 20 09:33:06 crc kubenswrapper[5094]: I0220 09:33:06.320089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerStarted","Data":"0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef"} Feb 20 09:33:06 crc kubenswrapper[5094]: I0220 09:33:06.339627 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6jzn" podStartSLOduration=3.9230232320000002 podStartE2EDuration="7.339606899s" podCreationTimestamp="2026-02-20 09:32:59 +0000 UTC" firstStartedPulling="2026-02-20 09:33:02.267357289 +0000 UTC m=+9997.139984000" lastFinishedPulling="2026-02-20 09:33:05.683940966 +0000 UTC m=+10000.556567667" observedRunningTime="2026-02-20 09:33:06.337896339 +0000 UTC m=+10001.210523050" watchObservedRunningTime="2026-02-20 09:33:06.339606899 +0000 UTC m=+10001.212233610" Feb 20 09:33:08 crc kubenswrapper[5094]: I0220 09:33:08.840542 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" 
Feb 20 09:33:08 crc kubenswrapper[5094]: E0220 09:33:08.841208 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.142813 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.142893 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.199338 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.407400 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.462689 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:11 crc kubenswrapper[5094]: I0220 09:33:11.813076 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:11 crc kubenswrapper[5094]: I0220 09:33:11.872047 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:12 crc kubenswrapper[5094]: I0220 09:33:12.376789 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6jzn" 
podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" containerID="cri-o://0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef" gracePeriod=2 Feb 20 09:33:12 crc kubenswrapper[5094]: I0220 09:33:12.851215 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:33:13 crc kubenswrapper[5094]: I0220 09:33:13.394930 5094 generic.go:334] "Generic (PLEG): container finished" podID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerID="0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef" exitCode=0 Feb 20 09:33:13 crc kubenswrapper[5094]: I0220 09:33:13.395006 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef"} Feb 20 09:33:13 crc kubenswrapper[5094]: I0220 09:33:13.395159 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-km8mn" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" containerID="cri-o://34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" gracePeriod=2 Feb 20 09:33:13 crc kubenswrapper[5094]: E0220 09:33:13.550867 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ef20e3_709d_4a1f_a616_c0259cebabd5.slice/crio-34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61.scope\": RecentStats: unable to find data in memory cache]" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.007938 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.017098 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160315 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"b9215716-a0b8-42e4-8f60-abbd516f91d6\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160494 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"b9215716-a0b8-42e4-8f60-abbd516f91d6\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160531 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"b9215716-a0b8-42e4-8f60-abbd516f91d6\" (UID: 
\"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160640 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.161364 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities" (OuterVolumeSpecName: "utilities") pod "a6ef20e3-709d-4a1f-a616-c0259cebabd5" (UID: "a6ef20e3-709d-4a1f-a616-c0259cebabd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.161494 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities" (OuterVolumeSpecName: "utilities") pod "b9215716-a0b8-42e4-8f60-abbd516f91d6" (UID: "b9215716-a0b8-42e4-8f60-abbd516f91d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.162203 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.162248 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.165803 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b" (OuterVolumeSpecName: "kube-api-access-9mk2b") pod "a6ef20e3-709d-4a1f-a616-c0259cebabd5" (UID: "a6ef20e3-709d-4a1f-a616-c0259cebabd5"). InnerVolumeSpecName "kube-api-access-9mk2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.174220 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v" (OuterVolumeSpecName: "kube-api-access-g9x2v") pod "b9215716-a0b8-42e4-8f60-abbd516f91d6" (UID: "b9215716-a0b8-42e4-8f60-abbd516f91d6"). InnerVolumeSpecName "kube-api-access-g9x2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.184053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9215716-a0b8-42e4-8f60-abbd516f91d6" (UID: "b9215716-a0b8-42e4-8f60-abbd516f91d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.264510 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.264840 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.264850 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.304936 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6ef20e3-709d-4a1f-a616-c0259cebabd5" (UID: "a6ef20e3-709d-4a1f-a616-c0259cebabd5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.367659 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.407263 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"851258ed832aa014433a1a37cc6b0d08924a30fb97018dfda95e09099f934eaf"} Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.407342 5094 scope.go:117] "RemoveContainer" containerID="0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.407392 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411036 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" exitCode=0 Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61"} Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"610c159ff287bb67be257ddc8c95e160d06d6ef93bad3fd074b27d4eb4beeeda"} Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411156 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.434209 5094 scope.go:117] "RemoveContainer" containerID="d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.464639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.481820 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.482349 5094 scope.go:117] "RemoveContainer" containerID="6a950e737c27d27ac2b886f1b67fb7e0f1d51bad1dc3e086671beee3ea9e99a6" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.492789 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.515073 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.515458 5094 scope.go:117] "RemoveContainer" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.558904 5094 scope.go:117] "RemoveContainer" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.603866 5094 scope.go:117] "RemoveContainer" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.628722 5094 scope.go:117] "RemoveContainer" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" Feb 20 09:33:14 crc kubenswrapper[5094]: E0220 09:33:14.629329 5094 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61\": container with ID starting with 34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61 not found: ID does not exist" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.629390 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61"} err="failed to get container status \"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61\": rpc error: code = NotFound desc = could not find container \"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61\": container with ID starting with 34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61 not found: ID does not exist" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.629417 5094 scope.go:117] "RemoveContainer" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" Feb 20 09:33:14 crc kubenswrapper[5094]: E0220 09:33:14.629997 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688\": container with ID starting with cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688 not found: ID does not exist" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.630024 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688"} err="failed to get container status \"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688\": rpc error: code = NotFound desc = could not find container 
\"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688\": container with ID starting with cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688 not found: ID does not exist" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.630047 5094 scope.go:117] "RemoveContainer" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" Feb 20 09:33:14 crc kubenswrapper[5094]: E0220 09:33:14.630423 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1\": container with ID starting with 2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1 not found: ID does not exist" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.630443 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1"} err="failed to get container status \"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1\": rpc error: code = NotFound desc = could not find container \"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1\": container with ID starting with 2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1 not found: ID does not exist" Feb 20 09:33:15 crc kubenswrapper[5094]: I0220 09:33:15.854671 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" path="/var/lib/kubelet/pods/a6ef20e3-709d-4a1f-a616-c0259cebabd5/volumes" Feb 20 09:33:15 crc kubenswrapper[5094]: I0220 09:33:15.855781 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" path="/var/lib/kubelet/pods/b9215716-a0b8-42e4-8f60-abbd516f91d6/volumes" Feb 20 09:33:19 crc kubenswrapper[5094]: I0220 09:33:19.840501 
5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:19 crc kubenswrapper[5094]: E0220 09:33:19.841184 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:31 crc kubenswrapper[5094]: I0220 09:33:31.840279 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:31 crc kubenswrapper[5094]: E0220 09:33:31.841059 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:44 crc kubenswrapper[5094]: I0220 09:33:44.840625 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:44 crc kubenswrapper[5094]: E0220 09:33:44.841517 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 
09:33:52.491718 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492827 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492845 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492862 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492871 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492901 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492910 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492924 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492932 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492966 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492974 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492992 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.493002 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.493266 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.493292 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.495398 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.515999 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.597831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-utilities\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.597900 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfwz\" (UniqueName: \"kubernetes.io/projected/67d20448-086a-4d76-b547-768f68c018f2-kube-api-access-pbfwz\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.598035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-catalog-content\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700012 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-catalog-content\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700239 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-utilities\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbfwz\" (UniqueName: \"kubernetes.io/projected/67d20448-086a-4d76-b547-768f68c018f2-kube-api-access-pbfwz\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700786 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-catalog-content\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.701405 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-utilities\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.734259 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfwz\" (UniqueName: \"kubernetes.io/projected/67d20448-086a-4d76-b547-768f68c018f2-kube-api-access-pbfwz\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.825044 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.277177 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.835797 5094 generic.go:334] "Generic (PLEG): container finished" podID="67d20448-086a-4d76-b547-768f68c018f2" containerID="643c447e66c80ac8e409efce76bf116e44220b0806a0ef9462d288f6b9525e21" exitCode=0 Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.835856 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerDied","Data":"643c447e66c80ac8e409efce76bf116e44220b0806a0ef9462d288f6b9525e21"} Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.835898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerStarted","Data":"636fac638a1c3adb897df51c5637d7ea4e67ced089cceb5902118444998e4673"} Feb 20 09:33:55 crc kubenswrapper[5094]: I0220 09:33:55.848679 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:55 crc kubenswrapper[5094]: E0220 09:33:55.849648 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:58 crc kubenswrapper[5094]: I0220 09:33:58.892224 5094 generic.go:334] "Generic (PLEG): container finished" podID="67d20448-086a-4d76-b547-768f68c018f2" 
containerID="88b2d7edfd9ae4aea620ef3f8c76de6d29966d07827651cec7c7fba1d9b759c9" exitCode=0 Feb 20 09:33:58 crc kubenswrapper[5094]: I0220 09:33:58.892289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerDied","Data":"88b2d7edfd9ae4aea620ef3f8c76de6d29966d07827651cec7c7fba1d9b759c9"} Feb 20 09:33:59 crc kubenswrapper[5094]: I0220 09:33:59.906267 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerStarted","Data":"6ce609b949dea07c6e6cae1376723e7c800390d20d3eaa646283825557899eac"} Feb 20 09:33:59 crc kubenswrapper[5094]: I0220 09:33:59.922343 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qk79m" podStartSLOduration=2.490177808 podStartE2EDuration="7.922321996s" podCreationTimestamp="2026-02-20 09:33:52 +0000 UTC" firstStartedPulling="2026-02-20 09:33:53.837884455 +0000 UTC m=+10048.710511166" lastFinishedPulling="2026-02-20 09:33:59.270028643 +0000 UTC m=+10054.142655354" observedRunningTime="2026-02-20 09:33:59.921461805 +0000 UTC m=+10054.794088536" watchObservedRunningTime="2026-02-20 09:33:59.922321996 +0000 UTC m=+10054.794948707" Feb 20 09:34:02 crc kubenswrapper[5094]: I0220 09:34:02.825568 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:02 crc kubenswrapper[5094]: I0220 09:34:02.826078 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:02 crc kubenswrapper[5094]: I0220 09:34:02.869508 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:10 crc kubenswrapper[5094]: I0220 
09:34:10.840872 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:34:10 crc kubenswrapper[5094]: E0220 09:34:10.841667 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:34:12 crc kubenswrapper[5094]: I0220 09:34:12.879502 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:12 crc kubenswrapper[5094]: I0220 09:34:12.946677 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.000751 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.001041 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tzpc7" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" containerID="cri-o://e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" gracePeriod=2 Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.516816 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.663059 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.663199 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.663272 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.666268 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities" (OuterVolumeSpecName: "utilities") pod "c6db9ece-1aa7-4ea4-b800-b710a760edf6" (UID: "c6db9ece-1aa7-4ea4-b800-b710a760edf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.671782 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt" (OuterVolumeSpecName: "kube-api-access-qjhpt") pod "c6db9ece-1aa7-4ea4-b800-b710a760edf6" (UID: "c6db9ece-1aa7-4ea4-b800-b710a760edf6"). InnerVolumeSpecName "kube-api-access-qjhpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.732242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6db9ece-1aa7-4ea4-b800-b710a760edf6" (UID: "c6db9ece-1aa7-4ea4-b800-b710a760edf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.765451 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.765483 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") on node \"crc\" DevicePath \"\"" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.765494 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051375 5094 generic.go:334] "Generic (PLEG): container finished" podID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" exitCode=0 Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb"} Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051454 5094 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051477 5094 scope.go:117] "RemoveContainer" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051464 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f"} Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.084365 5094 scope.go:117] "RemoveContainer" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.091426 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.108534 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.150763 5094 scope.go:117] "RemoveContainer" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.189991 5094 scope.go:117] "RemoveContainer" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" Feb 20 09:34:14 crc kubenswrapper[5094]: E0220 09:34:14.191061 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb\": container with ID starting with e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb not found: ID does not exist" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191092 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb"} err="failed to get container status \"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb\": rpc error: code = NotFound desc = could not find container \"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb\": container with ID starting with e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb not found: ID does not exist" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191111 5094 scope.go:117] "RemoveContainer" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" Feb 20 09:34:14 crc kubenswrapper[5094]: E0220 09:34:14.191442 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc\": container with ID starting with b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc not found: ID does not exist" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191462 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc"} err="failed to get container status \"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc\": rpc error: code = NotFound desc = could not find container \"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc\": container with ID starting with b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc not found: ID does not exist" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191474 5094 scope.go:117] "RemoveContainer" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" Feb 20 09:34:14 crc kubenswrapper[5094]: E0220 
09:34:14.191830 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0\": container with ID starting with 0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0 not found: ID does not exist" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191849 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0"} err="failed to get container status \"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0\": rpc error: code = NotFound desc = could not find container \"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0\": container with ID starting with 0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0 not found: ID does not exist" Feb 20 09:34:15 crc kubenswrapper[5094]: I0220 09:34:15.851652 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" path="/var/lib/kubelet/pods/c6db9ece-1aa7-4ea4-b800-b710a760edf6/volumes" Feb 20 09:34:25 crc kubenswrapper[5094]: I0220 09:34:25.847359 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:34:25 crc kubenswrapper[5094]: E0220 09:34:25.854815 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:34:37 crc kubenswrapper[5094]: I0220 09:34:37.841771 
5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:34:38 crc kubenswrapper[5094]: I0220 09:34:38.300643 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2"} Feb 20 09:35:42 crc kubenswrapper[5094]: I0220 09:35:42.989128 5094 generic.go:334] "Generic (PLEG): container finished" podID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerID="40b6b847514554d6d2915d390a68f23820ddde7e93d36e6d67b4ad209729e928" exitCode=0 Feb 20 09:35:42 crc kubenswrapper[5094]: I0220 09:35:42.989834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerDied","Data":"40b6b847514554d6d2915d390a68f23820ddde7e93d36e6d67b4ad209729e928"} Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.505281 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594214 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594340 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594423 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594462 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594488 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 
09:35:44.594510 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594621 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594661 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594688 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594807 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.603400 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5" (OuterVolumeSpecName: "kube-api-access-sbtc5") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "kube-api-access-sbtc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.604194 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.615155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph" (OuterVolumeSpecName: "ceph") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.637630 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.638464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.638505 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.643399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory" (OuterVolumeSpecName: "inventory") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.647053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.657142 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.658044 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.660261 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.661340 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.670728 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697369 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697402 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697412 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697421 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697430 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697439 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697447 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697457 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697466 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697474 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697482 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697490 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697508 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:45 crc kubenswrapper[5094]: I0220 09:35:45.016738 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" 
event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerDied","Data":"135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7"} Feb 20 09:35:45 crc kubenswrapper[5094]: I0220 09:35:45.016789 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:35:45 crc kubenswrapper[5094]: I0220 09:35:45.016798 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7" Feb 20 09:37:04 crc kubenswrapper[5094]: I0220 09:37:04.106356 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:37:04 crc kubenswrapper[5094]: I0220 09:37:04.107076 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:37:19 crc kubenswrapper[5094]: I0220 09:37:19.970258 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 09:37:19 crc kubenswrapper[5094]: I0220 09:37:19.971271 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" containerID="cri-o://0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0" gracePeriod=30 Feb 20 09:37:34 crc kubenswrapper[5094]: I0220 09:37:34.106578 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:37:34 crc kubenswrapper[5094]: I0220 09:37:34.107137 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.412583 5094 generic.go:334] "Generic (PLEG): container finished" podID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerID="0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0" exitCode=137 Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.412666 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerDied","Data":"0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0"} Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.413220 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerDied","Data":"9eef613ade19e2ed2f7272eef05d7fc30f774f3cc73f115f6f763788afc1cc96"} Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.413236 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eef613ade19e2ed2f7272eef05d7fc30f774f3cc73f115f6f763788afc1cc96" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.449975 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.569872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"38c66beb-a97f-470c-8999-e15f5c4a9b60\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.572005 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"38c66beb-a97f-470c-8999-e15f5c4a9b60\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.577887 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf" (OuterVolumeSpecName: "kube-api-access-rvljf") pod "38c66beb-a97f-470c-8999-e15f5c4a9b60" (UID: "38c66beb-a97f-470c-8999-e15f5c4a9b60"). InnerVolumeSpecName "kube-api-access-rvljf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.590197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7" (OuterVolumeSpecName: "mariadb-data") pod "38c66beb-a97f-470c-8999-e15f5c4a9b60" (UID: "38c66beb-a97f-470c-8999-e15f5c4a9b60"). InnerVolumeSpecName "pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.674615 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") on node \"crc\" DevicePath \"\"" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.674875 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") on node \"crc\" " Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.712908 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.713311 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7") on node "crc" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.776527 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") on node \"crc\" DevicePath \"\"" Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.421990 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.469687 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.483343 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.856680 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" path="/var/lib/kubelet/pods/38c66beb-a97f-470c-8999-e15f5c4a9b60/volumes" Feb 20 09:37:52 crc kubenswrapper[5094]: I0220 09:37:52.310293 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 09:37:52 crc kubenswrapper[5094]: I0220 09:37:52.311461 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" containerID="cri-o://0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e" gracePeriod=30 Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.106902 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.108022 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.108121 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.109687 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.109812 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2" gracePeriod=600 Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.609689 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2" exitCode=0 Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.610195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2"} Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.610236 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"} Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.610257 5094 scope.go:117] "RemoveContainer" 
containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823181 5094 generic.go:334] "Generic (PLEG): container finished" podID="4111d2dd-641f-4113-8751-4151d435e934" containerID="0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e" exitCode=137 Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerDied","Data":"0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e"} Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823678 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerDied","Data":"12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4"} Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823689 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4" Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.839055 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.008526 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"4111d2dd-641f-4113-8751-4151d435e934\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.008689 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"4111d2dd-641f-4113-8751-4151d435e934\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.009613 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"4111d2dd-641f-4113-8751-4151d435e934\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.015144 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5" (OuterVolumeSpecName: "kube-api-access-trgg5") pod "4111d2dd-641f-4113-8751-4151d435e934" (UID: "4111d2dd-641f-4113-8751-4151d435e934"). InnerVolumeSpecName "kube-api-access-trgg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.017161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "4111d2dd-641f-4113-8751-4151d435e934" (UID: "4111d2dd-641f-4113-8751-4151d435e934"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.032385 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87" (OuterVolumeSpecName: "ovn-data") pod "4111d2dd-641f-4113-8751-4151d435e934" (UID: "4111d2dd-641f-4113-8751-4151d435e934"). InnerVolumeSpecName "pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.112267 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.112325 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") on node \"crc\" " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.112336 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") on node \"crc\" DevicePath \"\"" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.134522 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.134693 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87") on node "crc" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.215671 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") on node \"crc\" DevicePath \"\"" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.832631 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.873044 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.883375 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 09:38:25 crc kubenswrapper[5094]: I0220 09:38:25.853576 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4111d2dd-641f-4113-8751-4151d435e934" path="/var/lib/kubelet/pods/4111d2dd-641f-4113-8751-4151d435e934/volumes" Feb 20 09:38:44 crc kubenswrapper[5094]: I0220 09:38:44.935052 5094 scope.go:117] "RemoveContainer" containerID="0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e" Feb 20 09:38:44 crc kubenswrapper[5094]: I0220 09:38:44.993829 5094 scope.go:117] "RemoveContainer" containerID="0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0" Feb 20 09:40:04 crc kubenswrapper[5094]: I0220 09:40:04.106446 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 20 09:40:04 crc kubenswrapper[5094]: I0220 09:40:04.107059 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:40:34 crc kubenswrapper[5094]: I0220 09:40:34.107154 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:40:34 crc kubenswrapper[5094]: I0220 09:40:34.107985 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.039673 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041555 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041584 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041608 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-utilities" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041622 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-utilities" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041636 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-content" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-content" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041665 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041673 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041742 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041751 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041764 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041770 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041966 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041987 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.042010 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.042022 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.043917 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.054343 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.106762 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.106837 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.106887 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.107821 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.107884 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" gracePeriod=600 Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.228782 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.229204 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.229352 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpgp\" (UniqueName: 
\"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.229424 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331014 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331172 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331739 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.349889 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.362903 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.663811 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" exitCode=0 Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.663854 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"} Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.664164 5094 scope.go:117] "RemoveContainer" containerID="31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.665166 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.665612 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.933991 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 09:41:05.680526 5094 generic.go:334] "Generic (PLEG): container finished" podID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerID="c5346f105c3a7aa086fca7b0b87e5e8579a3e80ebf063450c83b214c198260b6" exitCode=0 Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 
09:41:05.680635 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"c5346f105c3a7aa086fca7b0b87e5e8579a3e80ebf063450c83b214c198260b6"} Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 09:41:05.680930 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerStarted","Data":"71a63e5a0ea54fcc02961c1ea8fc4c44cc227dcf955deaa310c10c70c63fbaf3"} Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 09:41:05.683376 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:41:07 crc kubenswrapper[5094]: I0220 09:41:07.701316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerStarted","Data":"7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400"} Feb 20 09:41:09 crc kubenswrapper[5094]: I0220 09:41:09.727196 5094 generic.go:334] "Generic (PLEG): container finished" podID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerID="7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400" exitCode=0 Feb 20 09:41:09 crc kubenswrapper[5094]: I0220 09:41:09.727588 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400"} Feb 20 09:41:10 crc kubenswrapper[5094]: I0220 09:41:10.740884 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerStarted","Data":"75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5"} Feb 20 
09:41:10 crc kubenswrapper[5094]: I0220 09:41:10.772372 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vkzrm" podStartSLOduration=2.315427207 podStartE2EDuration="6.772351574s" podCreationTimestamp="2026-02-20 09:41:04 +0000 UTC" firstStartedPulling="2026-02-20 09:41:05.683149634 +0000 UTC m=+10480.555776335" lastFinishedPulling="2026-02-20 09:41:10.140073981 +0000 UTC m=+10485.012700702" observedRunningTime="2026-02-20 09:41:10.763445725 +0000 UTC m=+10485.636072446" watchObservedRunningTime="2026-02-20 09:41:10.772351574 +0000 UTC m=+10485.644978295" Feb 20 09:41:14 crc kubenswrapper[5094]: I0220 09:41:14.363599 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:14 crc kubenswrapper[5094]: I0220 09:41:14.366928 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:14 crc kubenswrapper[5094]: I0220 09:41:14.433449 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:16 crc kubenswrapper[5094]: I0220 09:41:16.841531 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:16 crc kubenswrapper[5094]: E0220 09:41:16.842466 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:24 crc kubenswrapper[5094]: I0220 09:41:24.414504 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:24 crc kubenswrapper[5094]: I0220 09:41:24.457458 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:24 crc kubenswrapper[5094]: I0220 09:41:24.926765 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vkzrm" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" containerID="cri-o://75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5" gracePeriod=2 Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.941810 5094 generic.go:334] "Generic (PLEG): container finished" podID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerID="75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5" exitCode=0 Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.941868 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5"} Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.942310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"71a63e5a0ea54fcc02961c1ea8fc4c44cc227dcf955deaa310c10c70c63fbaf3"} Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.942327 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a63e5a0ea54fcc02961c1ea8fc4c44cc227dcf955deaa310c10c70c63fbaf3" Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.945921 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.081421 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"f77272af-e36b-4ab1-9029-6485a5d2c95f\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.081641 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"f77272af-e36b-4ab1-9029-6485a5d2c95f\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.081725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"f77272af-e36b-4ab1-9029-6485a5d2c95f\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.082904 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities" (OuterVolumeSpecName: "utilities") pod "f77272af-e36b-4ab1-9029-6485a5d2c95f" (UID: "f77272af-e36b-4ab1-9029-6485a5d2c95f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.143375 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp" (OuterVolumeSpecName: "kube-api-access-dfpgp") pod "f77272af-e36b-4ab1-9029-6485a5d2c95f" (UID: "f77272af-e36b-4ab1-9029-6485a5d2c95f"). InnerVolumeSpecName "kube-api-access-dfpgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.150378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f77272af-e36b-4ab1-9029-6485a5d2c95f" (UID: "f77272af-e36b-4ab1-9029-6485a5d2c95f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.183451 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") on node \"crc\" DevicePath \"\"" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.183504 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.183515 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.951579 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.996332 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:27 crc kubenswrapper[5094]: I0220 09:41:27.007173 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:27 crc kubenswrapper[5094]: I0220 09:41:27.858237 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" path="/var/lib/kubelet/pods/f77272af-e36b-4ab1-9029-6485a5d2c95f/volumes" Feb 20 09:41:31 crc kubenswrapper[5094]: I0220 09:41:31.841279 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:31 crc kubenswrapper[5094]: E0220 09:41:31.842125 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:45 crc kubenswrapper[5094]: I0220 09:41:45.853282 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:45 crc kubenswrapper[5094]: E0220 09:41:45.854294 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:58 crc kubenswrapper[5094]: I0220 09:41:58.840829 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:58 crc kubenswrapper[5094]: E0220 09:41:58.841469 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:10 crc kubenswrapper[5094]: I0220 09:42:10.841047 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:10 crc kubenswrapper[5094]: E0220 09:42:10.843395 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:25 crc kubenswrapper[5094]: I0220 09:42:25.858510 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:25 crc kubenswrapper[5094]: E0220 09:42:25.859791 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:36 crc kubenswrapper[5094]: I0220 09:42:36.840757 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:36 crc kubenswrapper[5094]: E0220 09:42:36.842819 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:51 crc kubenswrapper[5094]: I0220 09:42:51.840791 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:51 crc kubenswrapper[5094]: E0220 09:42:51.841855 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.765821 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:02 crc kubenswrapper[5094]: E0220 09:43:02.766789 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="extract-utilities" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.766803 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" 
containerName="extract-utilities" Feb 20 09:43:02 crc kubenswrapper[5094]: E0220 09:43:02.766822 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="extract-content" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.766828 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="extract-content" Feb 20 09:43:02 crc kubenswrapper[5094]: E0220 09:43:02.766843 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.766850 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.767041 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.768429 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.784222 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.872698 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.872792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.872827 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.974619 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.975610 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.975816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.976042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.976335 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.995486 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:03 crc kubenswrapper[5094]: I0220 09:43:03.089800 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:03 crc kubenswrapper[5094]: I0220 09:43:03.557084 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:03 crc kubenswrapper[5094]: I0220 09:43:03.841203 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:43:03 crc kubenswrapper[5094]: E0220 09:43:03.841905 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:43:04 crc kubenswrapper[5094]: I0220 09:43:04.192527 5094 generic.go:334] "Generic (PLEG): container finished" podID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5" exitCode=0 Feb 20 09:43:04 crc kubenswrapper[5094]: I0220 09:43:04.192571 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5"} Feb 20 09:43:04 crc kubenswrapper[5094]: I0220 09:43:04.192595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerStarted","Data":"19574da3fb2273ee9563d8d3859e607ecb55ce1e9e72fd86b08f90594aef0a9e"} Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.191043 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 
09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.193619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.219223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerStarted","Data":"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"} Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.220544 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.325576 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.325644 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.325666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427350 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427909 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.428255 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.449424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqw9n\" 
(UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.512152 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:06 crc kubenswrapper[5094]: I0220 09:43:06.151978 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"]
Feb 20 09:43:06 crc kubenswrapper[5094]: I0220 09:43:06.235909 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerStarted","Data":"0e99f289f884916a03a641feb2a0bffd99bb555f0d1d3483e2c7822f04a8fa4b"}
Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.248024 5094 generic.go:334] "Generic (PLEG): container finished" podID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25" exitCode=0
Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.248106 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"}
Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.250118 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0" exitCode=0
Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.250158 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0"}
Feb 20 09:43:08 crc kubenswrapper[5094]: I0220 09:43:08.261019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerStarted","Data":"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"}
Feb 20 09:43:08 crc kubenswrapper[5094]: I0220 09:43:08.278428 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jhzc2" podStartSLOduration=2.677876233 podStartE2EDuration="6.27841036s" podCreationTimestamp="2026-02-20 09:43:02 +0000 UTC" firstStartedPulling="2026-02-20 09:43:04.194102718 +0000 UTC m=+10599.066729419" lastFinishedPulling="2026-02-20 09:43:07.794636845 +0000 UTC m=+10602.667263546" observedRunningTime="2026-02-20 09:43:08.276909394 +0000 UTC m=+10603.149536115" watchObservedRunningTime="2026-02-20 09:43:08.27841036 +0000 UTC m=+10603.151037071"
Feb 20 09:43:09 crc kubenswrapper[5094]: I0220 09:43:09.283616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerStarted","Data":"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"}
Feb 20 09:43:13 crc kubenswrapper[5094]: I0220 09:43:13.090150 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jhzc2"
Feb 20 09:43:13 crc kubenswrapper[5094]: I0220 09:43:13.090827 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jhzc2"
Feb 20 09:43:14 crc kubenswrapper[5094]: I0220 09:43:14.338557 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0" exitCode=0
Feb 20 09:43:14 crc kubenswrapper[5094]: I0220 09:43:14.338770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"}
Feb 20 09:43:14 crc kubenswrapper[5094]: I0220 09:43:14.686919 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-jhzc2" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server" probeResult="failure" output=<
Feb 20 09:43:14 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 09:43:14 crc kubenswrapper[5094]: >
Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.350000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerStarted","Data":"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"}
Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.378520 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kc77r" podStartSLOduration=2.903414118 podStartE2EDuration="10.378500871s" podCreationTimestamp="2026-02-20 09:43:05 +0000 UTC" firstStartedPulling="2026-02-20 09:43:07.258918206 +0000 UTC m=+10602.131544927" lastFinishedPulling="2026-02-20 09:43:14.734004959 +0000 UTC m=+10609.606631680" observedRunningTime="2026-02-20 09:43:15.372494034 +0000 UTC m=+10610.245120745" watchObservedRunningTime="2026-02-20 09:43:15.378500871 +0000 UTC m=+10610.251127592"
Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.513344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.513415 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.846611 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:43:15 crc kubenswrapper[5094]: E0220 09:43:15.846898 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:43:16 crc kubenswrapper[5094]: I0220 09:43:16.566124 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" probeResult="failure" output=<
Feb 20 09:43:16 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 09:43:16 crc kubenswrapper[5094]: >
Feb 20 09:43:23 crc kubenswrapper[5094]: I0220 09:43:23.144954 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jhzc2"
Feb 20 09:43:24 crc kubenswrapper[5094]: I0220 09:43:24.069050 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jhzc2"
Feb 20 09:43:24 crc kubenswrapper[5094]: I0220 09:43:24.120468 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"]
Feb 20 09:43:24 crc kubenswrapper[5094]: I0220 09:43:24.436288 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jhzc2" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server" containerID="cri-o://060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab" gracePeriod=2
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.000937 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.027223 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"32d8687f-685c-44e3-866d-2f5f1eb289e2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") "
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.027512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"32d8687f-685c-44e3-866d-2f5f1eb289e2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") "
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.027552 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"32d8687f-685c-44e3-866d-2f5f1eb289e2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") "
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.028363 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities" (OuterVolumeSpecName: "utilities") pod "32d8687f-685c-44e3-866d-2f5f1eb289e2" (UID: "32d8687f-685c-44e3-866d-2f5f1eb289e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.034588 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg" (OuterVolumeSpecName: "kube-api-access-kz2cg") pod "32d8687f-685c-44e3-866d-2f5f1eb289e2" (UID: "32d8687f-685c-44e3-866d-2f5f1eb289e2"). InnerVolumeSpecName "kube-api-access-kz2cg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.057513 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32d8687f-685c-44e3-866d-2f5f1eb289e2" (UID: "32d8687f-685c-44e3-866d-2f5f1eb289e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.130634 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.130698 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.130743 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") on node \"crc\" DevicePath \"\""
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447042 5094 generic.go:334] "Generic (PLEG): container finished" podID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab" exitCode=0
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"}
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"19574da3fb2273ee9563d8d3859e607ecb55ce1e9e72fd86b08f90594aef0a9e"}
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447141 5094 scope.go:117] "RemoveContainer" containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447163 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.468506 5094 scope.go:117] "RemoveContainer" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.494851 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"]
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.504589 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"]
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.663595 5094 scope.go:117] "RemoveContainer" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.745579 5094 scope.go:117] "RemoveContainer" containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"
Feb 20 09:43:25 crc kubenswrapper[5094]: E0220 09:43:25.746687 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab\": container with ID starting with 060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab not found: ID does not exist" containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.746762 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"} err="failed to get container status \"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab\": rpc error: code = NotFound desc = could not find container \"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab\": container with ID starting with 060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab not found: ID does not exist"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.746795 5094 scope.go:117] "RemoveContainer" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"
Feb 20 09:43:25 crc kubenswrapper[5094]: E0220 09:43:25.747293 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25\": container with ID starting with 4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25 not found: ID does not exist" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.747327 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"} err="failed to get container status \"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25\": rpc error: code = NotFound desc = could not find container \"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25\": container with ID starting with 4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25 not found: ID does not exist"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.747350 5094 scope.go:117] "RemoveContainer" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5"
Feb 20 09:43:25 crc kubenswrapper[5094]: E0220 09:43:25.747601 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5\": container with ID starting with 2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5 not found: ID does not exist" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.747630 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5"} err="failed to get container status \"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5\": rpc error: code = NotFound desc = could not find container \"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5\": container with ID starting with 2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5 not found: ID does not exist"
Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.854009 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" path="/var/lib/kubelet/pods/32d8687f-685c-44e3-866d-2f5f1eb289e2/volumes"
Feb 20 09:43:26 crc kubenswrapper[5094]: I0220 09:43:26.691789 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" probeResult="failure" output=<
Feb 20 09:43:26 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 09:43:26 crc kubenswrapper[5094]: >
Feb 20 09:43:26 crc kubenswrapper[5094]: I0220 09:43:26.840830 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:43:26 crc kubenswrapper[5094]: E0220 09:43:26.841283 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:43:36 crc kubenswrapper[5094]: I0220 09:43:36.558301 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" probeResult="failure" output=<
Feb 20 09:43:36 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 09:43:36 crc kubenswrapper[5094]: >
Feb 20 09:43:38 crc kubenswrapper[5094]: I0220 09:43:38.840569 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:43:38 crc kubenswrapper[5094]: E0220 09:43:38.841377 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:43:45 crc kubenswrapper[5094]: I0220 09:43:45.571957 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:45 crc kubenswrapper[5094]: I0220 09:43:45.648648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:45 crc kubenswrapper[5094]: I0220 09:43:45.815448 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"]
Feb 20 09:43:46 crc kubenswrapper[5094]: I0220 09:43:46.692038 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" containerID="cri-o://186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321" gracePeriod=2
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.214684 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.284076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") "
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.284642 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") "
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.284905 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") "
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.287727 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities" (OuterVolumeSpecName: "utilities") pod "d8c7e904-200a-45ed-aaa1-cd6bf6c71399" (UID: "d8c7e904-200a-45ed-aaa1-cd6bf6c71399"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.299280 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n" (OuterVolumeSpecName: "kube-api-access-jqw9n") pod "d8c7e904-200a-45ed-aaa1-cd6bf6c71399" (UID: "d8c7e904-200a-45ed-aaa1-cd6bf6c71399"). InnerVolumeSpecName "kube-api-access-jqw9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.388800 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") on node \"crc\" DevicePath \"\""
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.389159 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.432244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8c7e904-200a-45ed-aaa1-cd6bf6c71399" (UID: "d8c7e904-200a-45ed-aaa1-cd6bf6c71399"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.491310 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703870 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321" exitCode=0
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703926 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"}
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"0e99f289f884916a03a641feb2a0bffd99bb555f0d1d3483e2c7822f04a8fa4b"}
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703958 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.704047 5094 scope.go:117] "RemoveContainer" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.742677 5094 scope.go:117] "RemoveContainer" containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.742682 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"]
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.752080 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"]
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.771965 5094 scope.go:117] "RemoveContainer" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.815522 5094 scope.go:117] "RemoveContainer" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"
Feb 20 09:43:47 crc kubenswrapper[5094]: E0220 09:43:47.815892 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321\": container with ID starting with 186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321 not found: ID does not exist" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.815938 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"} err="failed to get container status \"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321\": rpc error: code = NotFound desc = could not find container \"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321\": container with ID starting with 186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321 not found: ID does not exist"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.815968 5094 scope.go:117] "RemoveContainer" containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"
Feb 20 09:43:47 crc kubenswrapper[5094]: E0220 09:43:47.816622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0\": container with ID starting with d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0 not found: ID does not exist" containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.816742 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"} err="failed to get container status \"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0\": rpc error: code = NotFound desc = could not find container \"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0\": container with ID starting with d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0 not found: ID does not exist"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.816821 5094 scope.go:117] "RemoveContainer" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0"
Feb 20 09:43:47 crc kubenswrapper[5094]: E0220 09:43:47.818351 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0\": container with ID starting with 9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0 not found: ID does not exist" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.818379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0"} err="failed to get container status \"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0\": rpc error: code = NotFound desc = could not find container \"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0\": container with ID starting with 9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0 not found: ID does not exist"
Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.852503 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" path="/var/lib/kubelet/pods/d8c7e904-200a-45ed-aaa1-cd6bf6c71399/volumes"
Feb 20 09:43:49 crc kubenswrapper[5094]: I0220 09:43:49.840324 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:43:49 crc kubenswrapper[5094]: E0220 09:43:49.841270 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:44:04 crc kubenswrapper[5094]: I0220 09:44:04.842057 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:44:04 crc kubenswrapper[5094]: E0220 09:44:04.842981 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:44:16 crc kubenswrapper[5094]: I0220 09:44:16.840897 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:44:16 crc kubenswrapper[5094]: E0220 09:44:16.842007 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:44:31 crc kubenswrapper[5094]: I0220 09:44:31.840580 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:44:31 crc kubenswrapper[5094]: E0220 09:44:31.842048 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:44:45 crc kubenswrapper[5094]: I0220 09:44:45.852932 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"
Feb 20 09:44:45 crc kubenswrapper[5094]: E0220 09:44:45.853768 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.752743 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"]
Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754060 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-content"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754084 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-content"
Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754120 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754134 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server"
Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754164 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-content"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754178 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-content"
Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754224 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-utilities"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754242 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-utilities"
Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754265 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754277 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server"
Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754318 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-utilities"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754333 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-utilities"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.755510 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.755555 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.758559 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.773086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"]
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.822052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.822146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.822200 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924260 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924790 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924796 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924875 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.925385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.949165 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc"
Feb 20 09:44:56 crc kubenswrapper[5094]: I0220 09:44:56.085630 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:56 crc kubenswrapper[5094]: I0220 09:44:56.613910 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:44:57 crc kubenswrapper[5094]: I0220 09:44:57.512157 5094 generic.go:334] "Generic (PLEG): container finished" podID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerID="f8828c57a2b8ec7317d67d2a0bf9858177944fa933e6d8df847de341eeb0cb4b" exitCode=0 Feb 20 09:44:57 crc kubenswrapper[5094]: I0220 09:44:57.512250 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"f8828c57a2b8ec7317d67d2a0bf9858177944fa933e6d8df847de341eeb0cb4b"} Feb 20 09:44:57 crc kubenswrapper[5094]: I0220 09:44:57.513452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerStarted","Data":"00b97c75da7493ed5a9472f088e87955e8d31c3a5e258ff8fda53af82feaa89a"} Feb 20 09:44:58 crc kubenswrapper[5094]: I0220 09:44:58.524397 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerStarted","Data":"f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3"} Feb 20 09:44:59 crc kubenswrapper[5094]: I0220 09:44:59.538523 5094 generic.go:334] "Generic (PLEG): container finished" podID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerID="f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3" exitCode=0 Feb 20 09:44:59 crc kubenswrapper[5094]: I0220 09:44:59.538573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" 
event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3"} Feb 20 09:44:59 crc kubenswrapper[5094]: I0220 09:44:59.841360 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:44:59 crc kubenswrapper[5094]: E0220 09:44:59.842233 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.180835 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg"] Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.182534 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.184669 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.186315 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.199385 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg"] Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.217453 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.217509 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.217622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.319888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.319961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.320111 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.320829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.337438 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.343041 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.505855 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.558475 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerStarted","Data":"270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241"} Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.592921 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wzmgc" podStartSLOduration=3.180805014 podStartE2EDuration="5.592899038s" podCreationTimestamp="2026-02-20 09:44:55 +0000 UTC" firstStartedPulling="2026-02-20 09:44:57.515117902 +0000 UTC m=+10712.387744613" lastFinishedPulling="2026-02-20 09:44:59.927211926 +0000 UTC m=+10714.799838637" observedRunningTime="2026-02-20 09:45:00.590163141 +0000 UTC m=+10715.462789872" watchObservedRunningTime="2026-02-20 09:45:00.592899038 +0000 UTC m=+10715.465525749" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.960107 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg"] Feb 20 09:45:01 crc kubenswrapper[5094]: I0220 09:45:01.567533 5094 generic.go:334] "Generic (PLEG): container finished" podID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerID="7f687d9b94feb89c77fd301d62419bc2a64a6c336bb3c4b73ccdb4bc555200cb" exitCode=0 Feb 20 09:45:01 crc kubenswrapper[5094]: I0220 09:45:01.567599 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" event={"ID":"6226166e-0c34-4bcc-a689-160cc6141fd2","Type":"ContainerDied","Data":"7f687d9b94feb89c77fd301d62419bc2a64a6c336bb3c4b73ccdb4bc555200cb"} Feb 20 09:45:01 crc kubenswrapper[5094]: I0220 09:45:01.567987 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" event={"ID":"6226166e-0c34-4bcc-a689-160cc6141fd2","Type":"ContainerStarted","Data":"514e7f34ca8fdc23ec75b68771b5e24c6ed0e5ae53f1402417ec6f9a03d88da2"} Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.946337 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.975645 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"6226166e-0c34-4bcc-a689-160cc6141fd2\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.975766 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"6226166e-0c34-4bcc-a689-160cc6141fd2\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.975960 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"6226166e-0c34-4bcc-a689-160cc6141fd2\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.976419 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume" (OuterVolumeSpecName: "config-volume") pod "6226166e-0c34-4bcc-a689-160cc6141fd2" (UID: "6226166e-0c34-4bcc-a689-160cc6141fd2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.982192 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5" (OuterVolumeSpecName: "kube-api-access-jhpd5") pod "6226166e-0c34-4bcc-a689-160cc6141fd2" (UID: "6226166e-0c34-4bcc-a689-160cc6141fd2"). 
InnerVolumeSpecName "kube-api-access-jhpd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.987014 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6226166e-0c34-4bcc-a689-160cc6141fd2" (UID: "6226166e-0c34-4bcc-a689-160cc6141fd2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.078168 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.078480 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.078493 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.589390 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" event={"ID":"6226166e-0c34-4bcc-a689-160cc6141fd2","Type":"ContainerDied","Data":"514e7f34ca8fdc23ec75b68771b5e24c6ed0e5ae53f1402417ec6f9a03d88da2"} Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.589430 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514e7f34ca8fdc23ec75b68771b5e24c6ed0e5ae53f1402417ec6f9a03d88da2" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.589483 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:04 crc kubenswrapper[5094]: I0220 09:45:04.030598 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:45:04 crc kubenswrapper[5094]: I0220 09:45:04.040246 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:45:05 crc kubenswrapper[5094]: I0220 09:45:05.854888 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30508b7a-ac76-48d8-822c-65a32552ca80" path="/var/lib/kubelet/pods/30508b7a-ac76-48d8-822c-65a32552ca80/volumes" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.086142 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.086220 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.167202 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.665053 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.727570 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:45:08 crc kubenswrapper[5094]: I0220 09:45:08.637111 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wzmgc" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" 
containerID="cri-o://270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241" gracePeriod=2 Feb 20 09:45:09 crc kubenswrapper[5094]: I0220 09:45:09.647964 5094 generic.go:334] "Generic (PLEG): container finished" podID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerID="270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241" exitCode=0 Feb 20 09:45:09 crc kubenswrapper[5094]: I0220 09:45:09.648284 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241"} Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.078697 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.151958 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"858ceda4-5973-45c6-8eed-2ae8f1da9129\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.152125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"858ceda4-5973-45c6-8eed-2ae8f1da9129\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.152215 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"858ceda4-5973-45c6-8eed-2ae8f1da9129\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 
09:45:10.153007 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities" (OuterVolumeSpecName: "utilities") pod "858ceda4-5973-45c6-8eed-2ae8f1da9129" (UID: "858ceda4-5973-45c6-8eed-2ae8f1da9129"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.158027 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr" (OuterVolumeSpecName: "kube-api-access-dgnjr") pod "858ceda4-5973-45c6-8eed-2ae8f1da9129" (UID: "858ceda4-5973-45c6-8eed-2ae8f1da9129"). InnerVolumeSpecName "kube-api-access-dgnjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.202785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "858ceda4-5973-45c6-8eed-2ae8f1da9129" (UID: "858ceda4-5973-45c6-8eed-2ae8f1da9129"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.254774 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.254815 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.254830 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.660215 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"00b97c75da7493ed5a9472f088e87955e8d31c3a5e258ff8fda53af82feaa89a"} Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.660288 5094 scope.go:117] "RemoveContainer" containerID="270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.660295 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.686639 5094 scope.go:117] "RemoveContainer" containerID="f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.704640 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.726125 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.732134 5094 scope.go:117] "RemoveContainer" containerID="f8828c57a2b8ec7317d67d2a0bf9858177944fa933e6d8df847de341eeb0cb4b" Feb 20 09:45:11 crc kubenswrapper[5094]: I0220 09:45:11.861429 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" path="/var/lib/kubelet/pods/858ceda4-5973-45c6-8eed-2ae8f1da9129/volumes" Feb 20 09:45:14 crc kubenswrapper[5094]: I0220 09:45:14.841215 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:14 crc kubenswrapper[5094]: E0220 09:45:14.842093 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:27 crc kubenswrapper[5094]: I0220 09:45:27.842468 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:27 crc kubenswrapper[5094]: E0220 09:45:27.843793 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:41 crc kubenswrapper[5094]: I0220 09:45:41.840513 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:41 crc kubenswrapper[5094]: E0220 09:45:41.841464 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:45 crc kubenswrapper[5094]: I0220 09:45:45.287785 5094 scope.go:117] "RemoveContainer" containerID="9dd7ec3da040b20e94b1ef4ad1e6147baa04fe89a2ea1cd7d18c7a1def8587f9" Feb 20 09:45:54 crc kubenswrapper[5094]: I0220 09:45:54.840479 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:54 crc kubenswrapper[5094]: E0220 09:45:54.841306 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:46:07 crc kubenswrapper[5094]: I0220 09:46:07.841160 5094 scope.go:117] 
"RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:46:09 crc kubenswrapper[5094]: I0220 09:46:09.608719 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8"} Feb 20 09:47:45 crc kubenswrapper[5094]: I0220 09:47:45.422940 5094 scope.go:117] "RemoveContainer" containerID="7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400" Feb 20 09:47:45 crc kubenswrapper[5094]: I0220 09:47:45.461241 5094 scope.go:117] "RemoveContainer" containerID="c5346f105c3a7aa086fca7b0b87e5e8579a3e80ebf063450c83b214c198260b6" Feb 20 09:47:45 crc kubenswrapper[5094]: I0220 09:47:45.529810 5094 scope.go:117] "RemoveContainer" containerID="75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5" Feb 20 09:48:34 crc kubenswrapper[5094]: I0220 09:48:34.106880 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:48:34 crc kubenswrapper[5094]: I0220 09:48:34.107534 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.327835 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328734 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-utilities" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328750 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-utilities" Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328781 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerName="collect-profiles" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328787 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerName="collect-profiles" Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328819 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328827 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328842 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-content" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328849 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-content" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.329044 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerName="collect-profiles" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.329069 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.329882 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.332095 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.332683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.333336 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.334878 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.342673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.414492 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415092 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415329 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415558 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416040 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416223 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416412 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416749 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.518599 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.518687 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.518983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519131 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519191 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519232 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: 
\"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519905 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.520463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.521375 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.521806 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.523911 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " 
pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.527487 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.530755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.536357 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.540915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.579428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.657034 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 09:48:39 crc kubenswrapper[5094]: I0220 09:48:39.191751 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 09:48:39 crc kubenswrapper[5094]: I0220 09:48:39.207252 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:48:39 crc kubenswrapper[5094]: I0220 09:48:39.414161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerStarted","Data":"e3fe55f15912750e6ef07e8fa7b4632b5f9782d82892810f96924fcbf7aff5a8"} Feb 20 09:49:04 crc kubenswrapper[5094]: I0220 09:49:04.107382 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:49:04 crc kubenswrapper[5094]: I0220 09:49:04.108058 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.492876 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.493509 5094 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.493842 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,Mou
ntPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7j29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(8e2aa894-2a09-4fad-bcc7-1f259ca48ac9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.495099 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" Feb 20 09:49:30 crc kubenswrapper[5094]: E0220 09:49:30.292645 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923\\\"\"" pod="openstack/tempest-tests-tempest" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.106903 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.107627 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.107690 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.108980 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.109091 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8" gracePeriod=600 Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.333781 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8" exitCode=0 Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.333940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8"} Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.334121 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:49:35 crc kubenswrapper[5094]: I0220 09:49:35.344291 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"} Feb 20 09:49:46 crc kubenswrapper[5094]: I0220 09:49:46.086425 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 20 09:49:47 crc kubenswrapper[5094]: I0220 09:49:47.471435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerStarted","Data":"2c94eb29c8d1a5480a6899dd7732430e17544eb4ff0b06be93fdd212c2a48558"} Feb 20 09:49:47 crc kubenswrapper[5094]: I0220 09:49:47.496101 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.620296081 podStartE2EDuration="1m10.496078753s" podCreationTimestamp="2026-02-20 09:48:37 +0000 UTC" 
firstStartedPulling="2026-02-20 09:48:39.206974034 +0000 UTC m=+10934.079600745" lastFinishedPulling="2026-02-20 09:49:46.082756666 +0000 UTC m=+11000.955383417" observedRunningTime="2026-02-20 09:49:47.492377513 +0000 UTC m=+11002.365004264" watchObservedRunningTime="2026-02-20 09:49:47.496078753 +0000 UTC m=+11002.368705504" Feb 20 09:51:34 crc kubenswrapper[5094]: I0220 09:51:34.107225 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:51:34 crc kubenswrapper[5094]: I0220 09:51:34.107852 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:52:04 crc kubenswrapper[5094]: I0220 09:52:04.107017 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:52:04 crc kubenswrapper[5094]: I0220 09:52:04.107807 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:52:23 crc kubenswrapper[5094]: I0220 09:52:23.989729 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 
09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.006337 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.006439 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.182300 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.182943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.183155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " 
pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285230 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.306299 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " 
pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.334551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:25 crc kubenswrapper[5094]: I0220 09:52:25.197124 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 09:52:26 crc kubenswrapper[5094]: I0220 09:52:26.118799 5094 generic.go:334] "Generic (PLEG): container finished" podID="efc939c9-c470-49ad-aa1f-cef315df8594" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9" exitCode=0 Feb 20 09:52:26 crc kubenswrapper[5094]: I0220 09:52:26.118882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9"} Feb 20 09:52:26 crc kubenswrapper[5094]: I0220 09:52:26.119267 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerStarted","Data":"d6f7074d93aaef589bc935f694e2d12082a5a977dd0619fbfa03bcb7292208bb"} Feb 20 09:52:27 crc kubenswrapper[5094]: I0220 09:52:27.129908 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerStarted","Data":"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"} Feb 20 09:52:29 crc kubenswrapper[5094]: I0220 09:52:29.146559 5094 generic.go:334] "Generic (PLEG): container finished" podID="efc939c9-c470-49ad-aa1f-cef315df8594" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c" exitCode=0 Feb 20 09:52:29 crc kubenswrapper[5094]: I0220 09:52:29.146640 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"}
Feb 20 09:52:30 crc kubenswrapper[5094]: I0220 09:52:30.165187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerStarted","Data":"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"}
Feb 20 09:52:30 crc kubenswrapper[5094]: I0220 09:52:30.195994 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vsfp4" podStartSLOduration=3.79358038 podStartE2EDuration="7.195974585s" podCreationTimestamp="2026-02-20 09:52:23 +0000 UTC" firstStartedPulling="2026-02-20 09:52:26.120539439 +0000 UTC m=+11160.993166150" lastFinishedPulling="2026-02-20 09:52:29.522933634 +0000 UTC m=+11164.395560355" observedRunningTime="2026-02-20 09:52:30.184427081 +0000 UTC m=+11165.057053792" watchObservedRunningTime="2026-02-20 09:52:30.195974585 +0000 UTC m=+11165.068601286"
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.107198 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.107582 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.107623 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.108272 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.108320 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" gracePeriod=600
Feb 20 09:52:34 crc kubenswrapper[5094]: E0220 09:52:34.235142 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.335643 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vsfp4"
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.335715 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vsfp4"
Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.391485 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vsfp4"
Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.218963 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" exitCode=0
Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.219067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"}
Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.219301 5094 scope.go:117] "RemoveContainer" containerID="02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8"
Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.220901 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:52:35 crc kubenswrapper[5094]: E0220 09:52:35.221390 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.293187 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vsfp4"
Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.350114 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"]
Feb 20 09:52:37 crc kubenswrapper[5094]: I0220 09:52:37.238718 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vsfp4" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server" containerID="cri-o://1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153" gracePeriod=2
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.035270 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.191441 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"efc939c9-c470-49ad-aa1f-cef315df8594\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") "
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.191657 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"efc939c9-c470-49ad-aa1f-cef315df8594\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") "
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.191743 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"efc939c9-c470-49ad-aa1f-cef315df8594\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") "
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.192639 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities" (OuterVolumeSpecName: "utilities") pod "efc939c9-c470-49ad-aa1f-cef315df8594" (UID: "efc939c9-c470-49ad-aa1f-cef315df8594"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.199889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct" (OuterVolumeSpecName: "kube-api-access-4g4ct") pod "efc939c9-c470-49ad-aa1f-cef315df8594" (UID: "efc939c9-c470-49ad-aa1f-cef315df8594"). InnerVolumeSpecName "kube-api-access-4g4ct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.252972 5094 generic.go:334] "Generic (PLEG): container finished" podID="efc939c9-c470-49ad-aa1f-cef315df8594" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153" exitCode=0
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253037 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"}
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"d6f7074d93aaef589bc935f694e2d12082a5a977dd0619fbfa03bcb7292208bb"}
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253105 5094 scope.go:117] "RemoveContainer" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.258115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efc939c9-c470-49ad-aa1f-cef315df8594" (UID: "efc939c9-c470-49ad-aa1f-cef315df8594"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.276692 5094 scope.go:117] "RemoveContainer" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.294568 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.294600 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") on node \"crc\" DevicePath \"\""
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.294611 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.297085 5094 scope.go:117] "RemoveContainer" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.337263 5094 scope.go:117] "RemoveContainer" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"
Feb 20 09:52:38 crc kubenswrapper[5094]: E0220 09:52:38.337719 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153\": container with ID starting with 1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153 not found: ID does not exist" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.337772 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"} err="failed to get container status \"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153\": rpc error: code = NotFound desc = could not find container \"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153\": container with ID starting with 1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153 not found: ID does not exist"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.337800 5094 scope.go:117] "RemoveContainer" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"
Feb 20 09:52:38 crc kubenswrapper[5094]: E0220 09:52:38.338096 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c\": container with ID starting with a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c not found: ID does not exist" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.338129 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"} err="failed to get container status \"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c\": rpc error: code = NotFound desc = could not find container \"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c\": container with ID starting with a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c not found: ID does not exist"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.338151 5094 scope.go:117] "RemoveContainer" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9"
Feb 20 09:52:38 crc kubenswrapper[5094]: E0220 09:52:38.338384 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9\": container with ID starting with 27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9 not found: ID does not exist" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.338414 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9"} err="failed to get container status \"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9\": rpc error: code = NotFound desc = could not find container \"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9\": container with ID starting with 27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9 not found: ID does not exist"
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.599451 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"]
Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.608203 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"]
Feb 20 09:52:39 crc kubenswrapper[5094]: I0220 09:52:39.852812 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" path="/var/lib/kubelet/pods/efc939c9-c470-49ad-aa1f-cef315df8594/volumes"
Feb 20 09:52:48 crc kubenswrapper[5094]: I0220 09:52:48.840086 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:52:48 crc kubenswrapper[5094]: E0220 09:52:48.840788 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:53:02 crc kubenswrapper[5094]: I0220 09:53:02.841289 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:53:02 crc kubenswrapper[5094]: E0220 09:53:02.842735 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:53:14 crc kubenswrapper[5094]: I0220 09:53:14.840822 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:53:14 crc kubenswrapper[5094]: E0220 09:53:14.841822 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:53:25 crc kubenswrapper[5094]: I0220 09:53:25.852080 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:53:25 crc kubenswrapper[5094]: E0220 09:53:25.853307 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.673446 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"]
Feb 20 09:53:26 crc kubenswrapper[5094]: E0220 09:53:26.674540 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.674562 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server"
Feb 20 09:53:26 crc kubenswrapper[5094]: E0220 09:53:26.674600 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-utilities"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.674610 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-utilities"
Feb 20 09:53:26 crc kubenswrapper[5094]: E0220 09:53:26.674660 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-content"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.674669 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-content"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.674957 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.676872 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.687247 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"]
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.689848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.689945 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.689997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.791650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.791843 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.791923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.792639 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.793277 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.811962 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.010676 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.510112 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"]
Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.775240 5094 generic.go:334] "Generic (PLEG): container finished" podID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d" exitCode=0
Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.775299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d"}
Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.775595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerStarted","Data":"face48fb76ab2bf77d89b9d776328f22131f9e1aaa8d0edf5235ddc072c31877"}
Feb 20 09:53:28 crc kubenswrapper[5094]: I0220 09:53:28.786492 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerStarted","Data":"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"}
Feb 20 09:53:29 crc kubenswrapper[5094]: I0220 09:53:29.798640 5094 generic.go:334] "Generic (PLEG): container finished" podID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0" exitCode=0
Feb 20 09:53:29 crc kubenswrapper[5094]: I0220 09:53:29.798691 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"}
Feb 20 09:53:31 crc kubenswrapper[5094]: I0220 09:53:31.824219 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerStarted","Data":"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"}
Feb 20 09:53:31 crc kubenswrapper[5094]: I0220 09:53:31.879108 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q457n" podStartSLOduration=2.4457314119999998 podStartE2EDuration="5.879064472s" podCreationTimestamp="2026-02-20 09:53:26 +0000 UTC" firstStartedPulling="2026-02-20 09:53:27.777445747 +0000 UTC m=+11222.650072448" lastFinishedPulling="2026-02-20 09:53:31.210778797 +0000 UTC m=+11226.083405508" observedRunningTime="2026-02-20 09:53:31.856193936 +0000 UTC m=+11226.728820667" watchObservedRunningTime="2026-02-20 09:53:31.879064472 +0000 UTC m=+11226.751691183"
Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 09:53:37.011391 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 09:53:37.011983 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 09:53:37.075155 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 09:53:37.943017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:38 crc kubenswrapper[5094]: I0220 09:53:38.009866 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"]
Feb 20 09:53:38 crc kubenswrapper[5094]: I0220 09:53:38.840096 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:53:38 crc kubenswrapper[5094]: E0220 09:53:38.840591 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:53:39 crc kubenswrapper[5094]: I0220 09:53:39.908761 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q457n" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" containerID="cri-o://7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3" gracePeriod=2
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.635162 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.784665 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") "
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.784752 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") "
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.784835 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") "
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.787188 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities" (OuterVolumeSpecName: "utilities") pod "ad834bd9-9f2a-4b16-a79b-4c66429e20f8" (UID: "ad834bd9-9f2a-4b16-a79b-4c66429e20f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.801015 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb" (OuterVolumeSpecName: "kube-api-access-k8bnb") pod "ad834bd9-9f2a-4b16-a79b-4c66429e20f8" (UID: "ad834bd9-9f2a-4b16-a79b-4c66429e20f8"). InnerVolumeSpecName "kube-api-access-k8bnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.808794 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad834bd9-9f2a-4b16-a79b-4c66429e20f8" (UID: "ad834bd9-9f2a-4b16-a79b-4c66429e20f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.886940 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") on node \"crc\" DevicePath \"\""
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.886979 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.886992 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922676 5094 generic.go:334] "Generic (PLEG): container finished" podID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3" exitCode=0
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"}
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922777 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n"
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922803 5094 scope.go:117] "RemoveContainer" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922790 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"face48fb76ab2bf77d89b9d776328f22131f9e1aaa8d0edf5235ddc072c31877"}
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.946446 5094 scope.go:117] "RemoveContainer" containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.960565 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"]
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.970739 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"]
Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.977414 5094 scope.go:117] "RemoveContainer" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d"
Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.031657 5094 scope.go:117] "RemoveContainer" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"
Feb 20 09:53:41 crc kubenswrapper[5094]: E0220 09:53:41.032285 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3\": container with ID starting with 7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3 not found: ID does not exist" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"
Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032322 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"} err="failed to get container status \"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3\": rpc error: code = NotFound desc = could not find container \"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3\": container with ID starting with 7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3 not found: ID does not exist"
Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032342 5094 scope.go:117] "RemoveContainer" containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"
Feb 20 09:53:41 crc kubenswrapper[5094]: E0220 09:53:41.032641 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0\": container with ID starting with 755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0 not found: ID does not exist" containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"
Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032662 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"} err="failed to get container status \"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0\": rpc error: code = NotFound desc = could not find container \"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0\": container with ID starting with 755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0 not found: ID does not exist"
Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032675 5094 scope.go:117] "RemoveContainer" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d"
Feb 20 09:53:41 crc kubenswrapper[5094]: E0220 09:53:41.033265 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d\": container with ID starting with c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d not found: ID does not exist" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d"
Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.033300 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d"} err="failed to get container status \"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d\": rpc error: code = NotFound desc = could not find container \"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d\": container with ID starting with c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d not found: ID does not exist"
Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.850780 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" path="/var/lib/kubelet/pods/ad834bd9-9f2a-4b16-a79b-4c66429e20f8/volumes"
Feb 20 09:53:51 crc kubenswrapper[5094]: I0220 09:53:51.839560 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:53:51 crc kubenswrapper[5094]: E0220 09:53:51.840266 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:54:06 crc kubenswrapper[5094]: I0220 09:54:06.840126 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:54:06 crc kubenswrapper[5094]: E0220 09:54:06.841177 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:54:17 crc kubenswrapper[5094]: I0220 09:54:17.840512 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:54:17 crc kubenswrapper[5094]: E0220 09:54:17.841404 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:54:29 crc kubenswrapper[5094]: I0220 09:54:29.840456 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 09:54:29 crc kubenswrapper[5094]: E0220 09:54:29.841335 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:54:40 crc kubenswrapper[5094]: I0220 
09:54:40.841053 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:54:40 crc kubenswrapper[5094]: E0220 09:54:40.842091 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.766484 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:54:44 crc kubenswrapper[5094]: E0220 09:54:44.767411 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767423 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" Feb 20 09:54:44 crc kubenswrapper[5094]: E0220 09:54:44.767433 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-utilities" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767440 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-utilities" Feb 20 09:54:44 crc kubenswrapper[5094]: E0220 09:54:44.767468 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-content" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767475 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-content" Feb 20 
09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767681 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.769152 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.782689 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.836186 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.836260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.836366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.937857 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.937929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.938016 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.938373 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.938443 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.963622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfmg\" (UniqueName: 
\"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:45 crc kubenswrapper[5094]: I0220 09:54:45.102073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:45 crc kubenswrapper[5094]: I0220 09:54:45.673546 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:54:45 crc kubenswrapper[5094]: I0220 09:54:45.856983 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerStarted","Data":"cb0f0380618df27adf79ad832178308b9549c7e55f77907064673108fdec1afe"} Feb 20 09:54:46 crc kubenswrapper[5094]: I0220 09:54:46.871163 5094 generic.go:334] "Generic (PLEG): container finished" podID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" exitCode=0 Feb 20 09:54:46 crc kubenswrapper[5094]: I0220 09:54:46.871248 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35"} Feb 20 09:54:46 crc kubenswrapper[5094]: I0220 09:54:46.877648 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:54:47 crc kubenswrapper[5094]: I0220 09:54:47.884899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerStarted","Data":"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb"} Feb 20 09:54:51 crc 
kubenswrapper[5094]: I0220 09:54:51.841003 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:54:51 crc kubenswrapper[5094]: E0220 09:54:51.841947 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:52 crc kubenswrapper[5094]: I0220 09:54:52.936686 5094 generic.go:334] "Generic (PLEG): container finished" podID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" exitCode=0 Feb 20 09:54:52 crc kubenswrapper[5094]: I0220 09:54:52.936746 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb"} Feb 20 09:54:53 crc kubenswrapper[5094]: I0220 09:54:53.950096 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerStarted","Data":"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f"} Feb 20 09:54:53 crc kubenswrapper[5094]: I0220 09:54:53.976922 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z8kfp" podStartSLOduration=3.508961127 podStartE2EDuration="9.976899807s" podCreationTimestamp="2026-02-20 09:54:44 +0000 UTC" firstStartedPulling="2026-02-20 09:54:46.877447393 +0000 UTC m=+11301.750074104" lastFinishedPulling="2026-02-20 09:54:53.345386073 +0000 UTC 
m=+11308.218012784" observedRunningTime="2026-02-20 09:54:53.966666634 +0000 UTC m=+11308.839293365" watchObservedRunningTime="2026-02-20 09:54:53.976899807 +0000 UTC m=+11308.849526518" Feb 20 09:54:55 crc kubenswrapper[5094]: I0220 09:54:55.102629 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:55 crc kubenswrapper[5094]: I0220 09:54:55.102922 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:56 crc kubenswrapper[5094]: I0220 09:54:56.158590 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z8kfp" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" probeResult="failure" output=< Feb 20 09:54:56 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:54:56 crc kubenswrapper[5094]: > Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.575506 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.578869 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.586943 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.632378 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.632515 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.632626 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735186 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735276 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735793 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.756601 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.927843 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:00 crc kubenswrapper[5094]: I0220 09:55:00.426413 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:00 crc kubenswrapper[5094]: W0220 09:55:00.426492 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c1bdcc1_da0d_4443_a24b_2f521f7b24db.slice/crio-d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c WatchSource:0}: Error finding container d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c: Status 404 returned error can't find the container with id d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c Feb 20 09:55:01 crc kubenswrapper[5094]: I0220 09:55:01.013430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8"} Feb 20 09:55:01 crc kubenswrapper[5094]: I0220 09:55:01.013291 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" exitCode=0 Feb 20 09:55:01 crc kubenswrapper[5094]: I0220 09:55:01.013818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerStarted","Data":"d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c"} Feb 20 09:55:02 crc kubenswrapper[5094]: I0220 09:55:02.025650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" 
event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerStarted","Data":"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827"} Feb 20 09:55:04 crc kubenswrapper[5094]: I0220 09:55:04.046008 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" exitCode=0 Feb 20 09:55:04 crc kubenswrapper[5094]: I0220 09:55:04.046080 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827"} Feb 20 09:55:04 crc kubenswrapper[5094]: I0220 09:55:04.840065 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:04 crc kubenswrapper[5094]: E0220 09:55:04.840556 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.056553 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerStarted","Data":"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b"} Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.084055 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6swkl" podStartSLOduration=2.652506004 podStartE2EDuration="6.08403166s" 
podCreationTimestamp="2026-02-20 09:54:59 +0000 UTC" firstStartedPulling="2026-02-20 09:55:01.016109568 +0000 UTC m=+11315.888736279" lastFinishedPulling="2026-02-20 09:55:04.447635224 +0000 UTC m=+11319.320261935" observedRunningTime="2026-02-20 09:55:05.079550819 +0000 UTC m=+11319.952177550" watchObservedRunningTime="2026-02-20 09:55:05.08403166 +0000 UTC m=+11319.956658371" Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.156044 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.226517 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:06 crc kubenswrapper[5094]: I0220 09:55:06.952673 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.072342 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z8kfp" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" containerID="cri-o://6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" gracePeriod=2 Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.756203 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.816557 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"95eeef2c-7672-4fd2-a004-9b285e87b509\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.816683 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"95eeef2c-7672-4fd2-a004-9b285e87b509\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.816827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"95eeef2c-7672-4fd2-a004-9b285e87b509\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.818172 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities" (OuterVolumeSpecName: "utilities") pod "95eeef2c-7672-4fd2-a004-9b285e87b509" (UID: "95eeef2c-7672-4fd2-a004-9b285e87b509"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.835445 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg" (OuterVolumeSpecName: "kube-api-access-wcfmg") pod "95eeef2c-7672-4fd2-a004-9b285e87b509" (UID: "95eeef2c-7672-4fd2-a004-9b285e87b509"). InnerVolumeSpecName "kube-api-access-wcfmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.919490 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.919525 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.959383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95eeef2c-7672-4fd2-a004-9b285e87b509" (UID: "95eeef2c-7672-4fd2-a004-9b285e87b509"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.021026 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104458 5094 generic.go:334] "Generic (PLEG): container finished" podID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" exitCode=0 Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f"} Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"cb0f0380618df27adf79ad832178308b9549c7e55f77907064673108fdec1afe"} Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104581 5094 scope.go:117] "RemoveContainer" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104640 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.142583 5094 scope.go:117] "RemoveContainer" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.144927 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.159689 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.179274 5094 scope.go:117] "RemoveContainer" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.227925 5094 scope.go:117] "RemoveContainer" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 09:55:08.229302 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f\": container with ID starting with 6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f not found: ID does not exist" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.229610 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f"} err="failed to get container status \"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f\": rpc error: code = NotFound desc = could not find container \"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f\": container with ID starting with 6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f not found: ID does not exist" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.229636 5094 scope.go:117] "RemoveContainer" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 09:55:08.229991 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb\": container with ID starting with 4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb not found: ID does not exist" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.230028 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb"} err="failed to get container status \"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb\": rpc error: code = NotFound desc = could not find container \"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb\": container with ID starting with 4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb not found: ID does not exist" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.230053 5094 scope.go:117] "RemoveContainer" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 
09:55:08.233301 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35\": container with ID starting with 733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35 not found: ID does not exist" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.233326 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35"} err="failed to get container status \"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35\": rpc error: code = NotFound desc = could not find container \"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35\": container with ID starting with 733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35 not found: ID does not exist" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 09:55:08.282249 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95eeef2c_7672_4fd2_a004_9b285e87b509.slice/crio-cb0f0380618df27adf79ad832178308b9549c7e55f77907064673108fdec1afe\": RecentStats: unable to find data in memory cache]" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.850790 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" path="/var/lib/kubelet/pods/95eeef2c-7672-4fd2-a004-9b285e87b509/volumes" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.928587 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.928633 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.982952 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:10 crc kubenswrapper[5094]: I0220 09:55:10.185098 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:11 crc kubenswrapper[5094]: I0220 09:55:11.151399 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.145114 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6swkl" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" containerID="cri-o://4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" gracePeriod=2 Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.835528 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.923896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.924079 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.924238 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.925057 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities" (OuterVolumeSpecName: "utilities") pod "5c1bdcc1-da0d-4443-a24b-2f521f7b24db" (UID: "5c1bdcc1-da0d-4443-a24b-2f521f7b24db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.930514 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh" (OuterVolumeSpecName: "kube-api-access-spklh") pod "5c1bdcc1-da0d-4443-a24b-2f521f7b24db" (UID: "5c1bdcc1-da0d-4443-a24b-2f521f7b24db"). InnerVolumeSpecName "kube-api-access-spklh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.001443 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c1bdcc1-da0d-4443-a24b-2f521f7b24db" (UID: "5c1bdcc1-da0d-4443-a24b-2f521f7b24db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.025891 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.025926 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.025940 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154247 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" exitCode=0 Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154291 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b"} Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154325 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c"} Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154346 5094 scope.go:117] "RemoveContainer" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154339 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.182359 5094 scope.go:117] "RemoveContainer" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.190595 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.199791 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.208858 5094 scope.go:117] "RemoveContainer" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.259983 5094 scope.go:117] "RemoveContainer" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" Feb 20 09:55:13 crc kubenswrapper[5094]: E0220 09:55:13.260564 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b\": container with ID starting with 4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b not found: ID does not exist" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 
09:55:13.260590 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b"} err="failed to get container status \"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b\": rpc error: code = NotFound desc = could not find container \"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b\": container with ID starting with 4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b not found: ID does not exist" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.260610 5094 scope.go:117] "RemoveContainer" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" Feb 20 09:55:13 crc kubenswrapper[5094]: E0220 09:55:13.261284 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827\": container with ID starting with 15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827 not found: ID does not exist" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.261487 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827"} err="failed to get container status \"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827\": rpc error: code = NotFound desc = could not find container \"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827\": container with ID starting with 15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827 not found: ID does not exist" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.261585 5094 scope.go:117] "RemoveContainer" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" Feb 20 09:55:13 crc 
kubenswrapper[5094]: E0220 09:55:13.265809 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8\": container with ID starting with 3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8 not found: ID does not exist" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.265841 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8"} err="failed to get container status \"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8\": rpc error: code = NotFound desc = could not find container \"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8\": container with ID starting with 3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8 not found: ID does not exist" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.851823 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" path="/var/lib/kubelet/pods/5c1bdcc1-da0d-4443-a24b-2f521f7b24db/volumes" Feb 20 09:55:17 crc kubenswrapper[5094]: I0220 09:55:17.840621 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:17 crc kubenswrapper[5094]: E0220 09:55:17.841395 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:28 crc 
kubenswrapper[5094]: I0220 09:55:28.840850 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:28 crc kubenswrapper[5094]: E0220 09:55:28.841558 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:40 crc kubenswrapper[5094]: I0220 09:55:40.841425 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:40 crc kubenswrapper[5094]: E0220 09:55:40.842152 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:51 crc kubenswrapper[5094]: I0220 09:55:51.840156 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:51 crc kubenswrapper[5094]: E0220 09:55:51.841032 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 
20 09:56:06 crc kubenswrapper[5094]: I0220 09:56:06.840466 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:06 crc kubenswrapper[5094]: E0220 09:56:06.841381 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:56:21 crc kubenswrapper[5094]: I0220 09:56:21.841016 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:21 crc kubenswrapper[5094]: E0220 09:56:21.851529 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:56:35 crc kubenswrapper[5094]: I0220 09:56:35.851788 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:35 crc kubenswrapper[5094]: E0220 09:56:35.852669 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:56:49 crc kubenswrapper[5094]: I0220 09:56:49.840860 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:49 crc kubenswrapper[5094]: E0220 09:56:49.841684 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:03 crc kubenswrapper[5094]: I0220 09:57:03.841112 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:03 crc kubenswrapper[5094]: E0220 09:57:03.841917 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:15 crc kubenswrapper[5094]: I0220 09:57:15.846831 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:15 crc kubenswrapper[5094]: E0220 09:57:15.847608 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:29 crc kubenswrapper[5094]: I0220 09:57:29.840265 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:29 crc kubenswrapper[5094]: E0220 09:57:29.840998 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:40 crc kubenswrapper[5094]: I0220 09:57:40.840929 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:41 crc kubenswrapper[5094]: I0220 09:57:41.613549 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1"} Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.206547 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"] Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207550 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207563 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207584 5094 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207591 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207607 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207614 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207627 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207633 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207645 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207650 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207661 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207668 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207887 5094 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207915 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.208661 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.210366 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.210489 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.218538 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"] Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.267683 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.267787 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" 
Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.267890 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.370174 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.370333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.370358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.371506 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"collect-profiles-29526360-cnw56\" 
(UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.385933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.386394 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.545765 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:01 crc kubenswrapper[5094]: I0220 10:00:01.165093 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"] Feb 20 10:00:02 crc kubenswrapper[5094]: I0220 10:00:02.048430 5094 generic.go:334] "Generic (PLEG): container finished" podID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerID="06958deee8569d614e28d60783078685bc1897a15343fd5aeef03a44767b5c64" exitCode=0 Feb 20 10:00:02 crc kubenswrapper[5094]: I0220 10:00:02.048532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" event={"ID":"c403611d-ad00-4c45-be94-83ae57d9a75f","Type":"ContainerDied","Data":"06958deee8569d614e28d60783078685bc1897a15343fd5aeef03a44767b5c64"} Feb 20 10:00:02 crc kubenswrapper[5094]: I0220 10:00:02.048727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" event={"ID":"c403611d-ad00-4c45-be94-83ae57d9a75f","Type":"ContainerStarted","Data":"d613a215ebd306e51f10950d4c1029ae3399dedbaf5e5a148636460b587a851f"} Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.610349 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.739294 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"c403611d-ad00-4c45-be94-83ae57d9a75f\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") "
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.739385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"c403611d-ad00-4c45-be94-83ae57d9a75f\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") "
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.739524 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"c403611d-ad00-4c45-be94-83ae57d9a75f\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") "
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.740243 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume" (OuterVolumeSpecName: "config-volume") pod "c403611d-ad00-4c45-be94-83ae57d9a75f" (UID: "c403611d-ad00-4c45-be94-83ae57d9a75f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.746020 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv" (OuterVolumeSpecName: "kube-api-access-85ctv") pod "c403611d-ad00-4c45-be94-83ae57d9a75f" (UID: "c403611d-ad00-4c45-be94-83ae57d9a75f"). InnerVolumeSpecName "kube-api-access-85ctv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.750830 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c403611d-ad00-4c45-be94-83ae57d9a75f" (UID: "c403611d-ad00-4c45-be94-83ae57d9a75f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.841761 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") on node \"crc\" DevicePath \"\""
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.842101 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.842122 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.088942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" event={"ID":"c403611d-ad00-4c45-be94-83ae57d9a75f","Type":"ContainerDied","Data":"d613a215ebd306e51f10950d4c1029ae3399dedbaf5e5a148636460b587a851f"}
Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.089042 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d613a215ebd306e51f10950d4c1029ae3399dedbaf5e5a148636460b587a851f"
Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.089119 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"
Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.107390 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.107597 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.783980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"]
Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.797089 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"]
Feb 20 10:00:05 crc kubenswrapper[5094]: I0220 10:00:05.851497 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" path="/var/lib/kubelet/pods/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1/volumes"
Feb 20 10:00:34 crc kubenswrapper[5094]: I0220 10:00:34.107238 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:00:34 crc kubenswrapper[5094]: I0220 10:00:34.107803 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:00:45 crc kubenswrapper[5094]: I0220 10:00:45.969146 5094 scope.go:117] "RemoveContainer" containerID="54cabe5f22fe8cc34888629b6ad81ec4c78f22e5ddf13beea44532e8ad37533e"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.147995 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29526361-srdg8"]
Feb 20 10:01:00 crc kubenswrapper[5094]: E0220 10:01:00.148978 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerName="collect-profiles"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.148991 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerName="collect-profiles"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.149209 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerName="collect-profiles"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.150029 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.169933 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526361-srdg8"]
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246446 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.348953 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.349208 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.349269 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.349357 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.356053 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.356825 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.357678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.369946 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.476262 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.123499 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526361-srdg8"]
Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.665480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerStarted","Data":"e01a6f191157f2a182b310659da19066acb8623209878837eb3ecfca8150fc03"}
Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.666237 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerStarted","Data":"866d1dd340cedbaa173010fb9fe515d661516c32b1e799f0b7ad9a2f543304b4"}
Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.687858 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29526361-srdg8" podStartSLOduration=1.687837838 podStartE2EDuration="1.687837838s" podCreationTimestamp="2026-02-20 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:01:01.680962877 +0000 UTC m=+11676.553589588" watchObservedRunningTime="2026-02-20 10:01:01.687837838 +0000 UTC m=+11676.560464549"
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.107184 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.107851 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.107997 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.109573 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.109651 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1" gracePeriod=600
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.694160 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerID="e01a6f191157f2a182b310659da19066acb8623209878837eb3ecfca8150fc03" exitCode=0
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.694272 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerDied","Data":"e01a6f191157f2a182b310659da19066acb8623209878837eb3ecfca8150fc03"}
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698670 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1" exitCode=0
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698734 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1"}
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"}
Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698783 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.336369 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381046 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") "
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381230 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") "
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381326 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") "
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381362 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") "
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.388871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.395002 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8" (OuterVolumeSpecName: "kube-api-access-v9ck8") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "kube-api-access-v9ck8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.433905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.466405 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data" (OuterVolumeSpecName: "config-data") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484114 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484140 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") on node \"crc\" DevicePath \"\""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484151 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484158 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.729390 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerDied","Data":"866d1dd340cedbaa173010fb9fe515d661516c32b1e799f0b7ad9a2f543304b4"}
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.729430 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866d1dd340cedbaa173010fb9fe515d661516c32b1e799f0b7ad9a2f543304b4"
Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.729482 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8"
Feb 20 10:03:04 crc kubenswrapper[5094]: I0220 10:03:04.107181 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:03:04 crc kubenswrapper[5094]: I0220 10:03:04.107770 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:03:34 crc kubenswrapper[5094]: I0220 10:03:34.106830 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:03:34 crc kubenswrapper[5094]: I0220 10:03:34.107492 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.106686 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.107480 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.107535 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.108651 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.108791 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" gracePeriod=600
Feb 20 10:04:04 crc kubenswrapper[5094]: E0220 10:04:04.227220 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.285272 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" exitCode=0
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.285317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"}
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.285354 5094 scope.go:117] "RemoveContainer" containerID="93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1"
Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.286215 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"
Feb 20 10:04:04 crc kubenswrapper[5094]: E0220 10:04:04.286831 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:04:16 crc kubenswrapper[5094]: I0220 10:04:16.861630 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"
Feb 20 10:04:16 crc kubenswrapper[5094]: E0220 10:04:16.864788 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:04:27 crc kubenswrapper[5094]: I0220 10:04:27.840994 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"
Feb 20 10:04:27 crc kubenswrapper[5094]: E0220 10:04:27.841875 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:04:42 crc kubenswrapper[5094]: I0220 10:04:42.841300 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"
Feb 20 10:04:42 crc kubenswrapper[5094]: E0220 10:04:42.842369 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:04:52 crc kubenswrapper[5094]: I0220 10:04:52.876179 5094 generic.go:334] "Generic (PLEG): container finished" podID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerID="2c94eb29c8d1a5480a6899dd7732430e17544eb4ff0b06be93fdd212c2a48558" exitCode=0
Feb 20 10:04:52 crc kubenswrapper[5094]: I0220 10:04:52.876435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerDied","Data":"2c94eb29c8d1a5480a6899dd7732430e17544eb4ff0b06be93fdd212c2a48558"}
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.380546 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.532868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.532944 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.532981 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533033 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533119 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533148 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533173 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533261 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533278 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533804 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data" (OuterVolumeSpecName: "config-data") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.538526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.555393 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.574274 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.580044 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29" (OuterVolumeSpecName: "kube-api-access-v7j29") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "kube-api-access-v7j29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.584035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.590815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.603305 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.603767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636090 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636126 5094 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636141 5094 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636157 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") on node \"crc\" DevicePath \"\""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636196 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636208 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") on node \"crc\" DevicePath \"\""
Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636220 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb
20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636233 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636244 5094 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.661002 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.740758 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.904646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerDied","Data":"e3fe55f15912750e6ef07e8fa7b4632b5f9782d82892810f96924fcbf7aff5a8"} Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.904687 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3fe55f15912750e6ef07e8fa7b4632b5f9782d82892810f96924fcbf7aff5a8" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.904783 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 10:04:55 crc kubenswrapper[5094]: I0220 10:04:55.841061 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:04:55 crc kubenswrapper[5094]: E0220 10:04:55.841914 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.772142 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 10:04:58 crc kubenswrapper[5094]: E0220 10:04:58.773524 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerName="tempest-tests-tempest-tests-runner" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.773549 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerName="tempest-tests-tempest-tests-runner" Feb 20 10:04:58 crc kubenswrapper[5094]: E0220 10:04:58.773607 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerName="keystone-cron" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.773620 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerName="keystone-cron" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.774061 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerName="tempest-tests-tempest-tests-runner" Feb 20 10:04:58 crc 
kubenswrapper[5094]: I0220 10:04:58.774085 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerName="keystone-cron" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.775566 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.778608 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.785701 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.934901 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.935082 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgx9\" (UniqueName: \"kubernetes.io/projected/6798c144-2ada-4a54-98c4-72db0e7bd732-kube-api-access-qvgx9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.041625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgx9\" (UniqueName: \"kubernetes.io/projected/6798c144-2ada-4a54-98c4-72db0e7bd732-kube-api-access-qvgx9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.045533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.046334 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.080950 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvgx9\" (UniqueName: \"kubernetes.io/projected/6798c144-2ada-4a54-98c4-72db0e7bd732-kube-api-access-qvgx9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.091485 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.101860 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: W0220 10:04:59.682987 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6798c144_2ada_4a54_98c4_72db0e7bd732.slice/crio-faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2 WatchSource:0}: Error finding container faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2: Status 404 returned error can't find the container with id faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2 Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.688110 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.689934 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.966579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6798c144-2ada-4a54-98c4-72db0e7bd732","Type":"ContainerStarted","Data":"faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2"} Feb 20 10:05:00 crc kubenswrapper[5094]: I0220 10:05:00.979453 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6798c144-2ada-4a54-98c4-72db0e7bd732","Type":"ContainerStarted","Data":"fd52b835d21d44adbc9e9b037575e81b3574cf3fc67df2488927d93b8648fb3b"} Feb 20 10:05:01 crc kubenswrapper[5094]: I0220 10:05:01.002869 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.140427751 podStartE2EDuration="3.002850243s" podCreationTimestamp="2026-02-20 10:04:58 +0000 UTC" 
firstStartedPulling="2026-02-20 10:04:59.689510098 +0000 UTC m=+11914.562136829" lastFinishedPulling="2026-02-20 10:05:00.5519326 +0000 UTC m=+11915.424559321" observedRunningTime="2026-02-20 10:05:00.993639734 +0000 UTC m=+11915.866266455" watchObservedRunningTime="2026-02-20 10:05:01.002850243 +0000 UTC m=+11915.875476964" Feb 20 10:05:09 crc kubenswrapper[5094]: I0220 10:05:09.840921 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:09 crc kubenswrapper[5094]: E0220 10:05:09.841997 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:21 crc kubenswrapper[5094]: I0220 10:05:21.840254 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:21 crc kubenswrapper[5094]: E0220 10:05:21.841564 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:34 crc kubenswrapper[5094]: I0220 10:05:34.841632 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:34 crc kubenswrapper[5094]: E0220 10:05:34.843300 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:45 crc kubenswrapper[5094]: I0220 10:05:45.849966 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:45 crc kubenswrapper[5094]: E0220 10:05:45.850765 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:46 crc kubenswrapper[5094]: I0220 10:05:46.922006 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:46 crc kubenswrapper[5094]: I0220 10:05:46.929343 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:46 crc kubenswrapper[5094]: I0220 10:05:46.945092 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.031584 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.031974 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.032138 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.135826 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.136029 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.136131 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.136888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.137132 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.169660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.260302 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.801392 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:48 crc kubenswrapper[5094]: I0220 10:05:48.678854 5094 generic.go:334] "Generic (PLEG): container finished" podID="da91a28e-9af4-44ae-a45e-542551dc917c" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" exitCode=0 Feb 20 10:05:48 crc kubenswrapper[5094]: I0220 10:05:48.678951 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045"} Feb 20 10:05:48 crc kubenswrapper[5094]: I0220 10:05:48.679448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerStarted","Data":"b9d4df055f589b6802cc89131a9aee1fa42a3cbe8017b6961f057dd461070128"} Feb 20 10:05:49 crc kubenswrapper[5094]: I0220 10:05:49.691891 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerStarted","Data":"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b"} Feb 20 10:05:51 crc kubenswrapper[5094]: I0220 10:05:51.724299 5094 generic.go:334] "Generic (PLEG): container finished" podID="da91a28e-9af4-44ae-a45e-542551dc917c" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" exitCode=0 Feb 20 10:05:51 crc kubenswrapper[5094]: I0220 10:05:51.724394 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" 
event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b"} Feb 20 10:05:52 crc kubenswrapper[5094]: I0220 10:05:52.758621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerStarted","Data":"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c"} Feb 20 10:05:52 crc kubenswrapper[5094]: I0220 10:05:52.800732 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jvm7h" podStartSLOduration=3.388236217 podStartE2EDuration="6.80069024s" podCreationTimestamp="2026-02-20 10:05:46 +0000 UTC" firstStartedPulling="2026-02-20 10:05:48.683387696 +0000 UTC m=+11963.556014417" lastFinishedPulling="2026-02-20 10:05:52.095841689 +0000 UTC m=+11966.968468440" observedRunningTime="2026-02-20 10:05:52.779024743 +0000 UTC m=+11967.651651464" watchObservedRunningTime="2026-02-20 10:05:52.80069024 +0000 UTC m=+11967.673316961" Feb 20 10:05:56 crc kubenswrapper[5094]: I0220 10:05:56.840359 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:56 crc kubenswrapper[5094]: E0220 10:05:56.841328 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.260821 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc 
kubenswrapper[5094]: I0220 10:05:57.260968 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.334505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.898649 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.962725 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:59 crc kubenswrapper[5094]: I0220 10:05:59.854109 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jvm7h" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" containerID="cri-o://463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" gracePeriod=2 Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.345696 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.379444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"da91a28e-9af4-44ae-a45e-542551dc917c\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.379668 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"da91a28e-9af4-44ae-a45e-542551dc917c\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.380009 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"da91a28e-9af4-44ae-a45e-542551dc917c\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.380610 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities" (OuterVolumeSpecName: "utilities") pod "da91a28e-9af4-44ae-a45e-542551dc917c" (UID: "da91a28e-9af4-44ae-a45e-542551dc917c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.380884 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.390456 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm" (OuterVolumeSpecName: "kube-api-access-r9gbm") pod "da91a28e-9af4-44ae-a45e-542551dc917c" (UID: "da91a28e-9af4-44ae-a45e-542551dc917c"). InnerVolumeSpecName "kube-api-access-r9gbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.469872 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da91a28e-9af4-44ae-a45e-542551dc917c" (UID: "da91a28e-9af4-44ae-a45e-542551dc917c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.483125 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.483177 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.866739 5094 generic.go:334] "Generic (PLEG): container finished" podID="da91a28e-9af4-44ae-a45e-542551dc917c" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" exitCode=0 Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.866826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c"} Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.867241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"b9d4df055f589b6802cc89131a9aee1fa42a3cbe8017b6961f057dd461070128"} Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.867269 5094 scope.go:117] "RemoveContainer" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.866865 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.900435 5094 scope.go:117] "RemoveContainer" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.908862 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.919204 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.952916 5094 scope.go:117] "RemoveContainer" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.980263 5094 scope.go:117] "RemoveContainer" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" Feb 20 10:06:00 crc kubenswrapper[5094]: E0220 10:06:00.986170 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c\": container with ID starting with 463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c not found: ID does not exist" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986207 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c"} err="failed to get container status \"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c\": rpc error: code = NotFound desc = could not find container \"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c\": container with ID starting with 463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c not 
found: ID does not exist" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986255 5094 scope.go:117] "RemoveContainer" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" Feb 20 10:06:00 crc kubenswrapper[5094]: E0220 10:06:00.986677 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b\": container with ID starting with 456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b not found: ID does not exist" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986798 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b"} err="failed to get container status \"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b\": rpc error: code = NotFound desc = could not find container \"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b\": container with ID starting with 456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b not found: ID does not exist" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986905 5094 scope.go:117] "RemoveContainer" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" Feb 20 10:06:00 crc kubenswrapper[5094]: E0220 10:06:00.987361 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045\": container with ID starting with 643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045 not found: ID does not exist" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.987410 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045"} err="failed to get container status \"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045\": rpc error: code = NotFound desc = could not find container \"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045\": container with ID starting with 643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045 not found: ID does not exist" Feb 20 10:06:01 crc kubenswrapper[5094]: I0220 10:06:01.859284 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" path="/var/lib/kubelet/pods/da91a28e-9af4-44ae-a45e-542551dc917c/volumes" Feb 20 10:06:11 crc kubenswrapper[5094]: I0220 10:06:11.146243 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:11 crc kubenswrapper[5094]: E0220 10:06:11.149644 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.005014 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:06:22 crc kubenswrapper[5094]: E0220 10:06:22.005992 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="extract-content" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006006 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" 
containerName="extract-content" Feb 20 10:06:22 crc kubenswrapper[5094]: E0220 10:06:22.006029 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006035 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" Feb 20 10:06:22 crc kubenswrapper[5094]: E0220 10:06:22.006063 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="extract-utilities" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006070 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="extract-utilities" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006262 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.010014 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.011779 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pc5cx"/"default-dockercfg-6hz5p" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.014239 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pc5cx"/"openshift-service-ca.crt" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.014244 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pc5cx"/"kube-root-ca.crt" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.020590 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.085628 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.086067 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.188613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " 
pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.188832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.189233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.214538 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.328466 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.847757 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:06:23 crc kubenswrapper[5094]: I0220 10:06:23.295153 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerStarted","Data":"2ec39c0a5b5164de0b8530de19fc0bc02b72ed51c4f04ce0cff45952f8f5ad7f"} Feb 20 10:06:24 crc kubenswrapper[5094]: I0220 10:06:24.840427 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:24 crc kubenswrapper[5094]: E0220 10:06:24.840968 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:06:31 crc kubenswrapper[5094]: I0220 10:06:31.404344 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerStarted","Data":"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147"} Feb 20 10:06:31 crc kubenswrapper[5094]: I0220 10:06:31.405083 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerStarted","Data":"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"} Feb 20 10:06:31 crc kubenswrapper[5094]: I0220 10:06:31.421941 5094 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" podStartSLOduration=3.014911463 podStartE2EDuration="10.421917605s" podCreationTimestamp="2026-02-20 10:06:21 +0000 UTC" firstStartedPulling="2026-02-20 10:06:22.85228688 +0000 UTC m=+11997.724913591" lastFinishedPulling="2026-02-20 10:06:30.259293022 +0000 UTC m=+12005.131919733" observedRunningTime="2026-02-20 10:06:31.420294095 +0000 UTC m=+12006.292920806" watchObservedRunningTime="2026-02-20 10:06:31.421917605 +0000 UTC m=+12006.294544316" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.342845 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-4vf4g"] Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.345451 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.449105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.449153 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.551556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"crc-debug-4vf4g\" (UID: 
\"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.551661 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.551818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.578403 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.662212 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:37 crc kubenswrapper[5094]: I0220 10:06:37.488199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" event={"ID":"603fd649-921c-4ddc-aaea-ca487d399cdc","Type":"ContainerStarted","Data":"d8e323d826373496a351adc42cad6ef3517155efe7a5cb6c8fc3f30807bc0181"} Feb 20 10:06:37 crc kubenswrapper[5094]: I0220 10:06:37.840972 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:37 crc kubenswrapper[5094]: E0220 10:06:37.841240 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:06:46 crc kubenswrapper[5094]: I0220 10:06:46.593795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" event={"ID":"603fd649-921c-4ddc-aaea-ca487d399cdc","Type":"ContainerStarted","Data":"5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f"} Feb 20 10:06:46 crc kubenswrapper[5094]: I0220 10:06:46.627767 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" podStartSLOduration=1.376188005 podStartE2EDuration="10.62774171s" podCreationTimestamp="2026-02-20 10:06:36 +0000 UTC" firstStartedPulling="2026-02-20 10:06:36.725421578 +0000 UTC m=+12011.598048299" lastFinishedPulling="2026-02-20 10:06:45.976975253 +0000 UTC m=+12020.849602004" observedRunningTime="2026-02-20 10:06:46.61196169 +0000 UTC m=+12021.484588431" watchObservedRunningTime="2026-02-20 10:06:46.62774171 +0000 UTC 
m=+12021.500368461" Feb 20 10:06:50 crc kubenswrapper[5094]: I0220 10:06:50.840958 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:50 crc kubenswrapper[5094]: E0220 10:06:50.843344 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:04 crc kubenswrapper[5094]: I0220 10:07:04.841382 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:04 crc kubenswrapper[5094]: E0220 10:07:04.842633 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:16 crc kubenswrapper[5094]: I0220 10:07:16.840223 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:16 crc kubenswrapper[5094]: E0220 10:07:16.841332 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:27 crc kubenswrapper[5094]: I0220 10:07:27.840768 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:27 crc kubenswrapper[5094]: E0220 10:07:27.841973 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.089136 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.091641 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.104725 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.152167 5094 generic.go:334] "Generic (PLEG): container finished" podID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerID="5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f" exitCode=0 Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.152492 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" event={"ID":"603fd649-921c-4ddc-aaea-ca487d399cdc","Type":"ContainerDied","Data":"5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f"} Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.240583 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.240808 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.240920 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343162 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343310 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343369 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343804 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.373968 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.443111 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.913734 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.166436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerStarted","Data":"b93f0081cd0b061954685743624431c955a43dc58ffb6a377a892f811f13381d"} Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.256990 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.305951 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-4vf4g"] Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.315993 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-4vf4g"] Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.362023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"603fd649-921c-4ddc-aaea-ca487d399cdc\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.362467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"603fd649-921c-4ddc-aaea-ca487d399cdc\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.362551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host" (OuterVolumeSpecName: "host") pod "603fd649-921c-4ddc-aaea-ca487d399cdc" (UID: "603fd649-921c-4ddc-aaea-ca487d399cdc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.363127 5094 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.374788 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r" (OuterVolumeSpecName: "kube-api-access-rfx9r") pod "603fd649-921c-4ddc-aaea-ca487d399cdc" (UID: "603fd649-921c-4ddc-aaea-ca487d399cdc"). InnerVolumeSpecName "kube-api-access-rfx9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.464772 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.194176 5094 generic.go:334] "Generic (PLEG): container finished" podID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" exitCode=0 Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.194357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0"} Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.197116 5094 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d8e323d826373496a351adc42cad6ef3517155efe7a5cb6c8fc3f30807bc0181" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.197224 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.467571 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:31 crc kubenswrapper[5094]: E0220 10:07:31.468528 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerName="container-00" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.468552 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerName="container-00" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.468942 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerName="container-00" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.470915 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.480930 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-9mxxw"] Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.482812 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.494693 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601692 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601829 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601880 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601901 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601920 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703390 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703503 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703543 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703668 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.704125 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.704282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.704557 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.723604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.727438 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c78r9\" (UniqueName: 
\"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.844575 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.858130 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.861762 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" path="/var/lib/kubelet/pods/603fd649-921c-4ddc-aaea-ca487d399cdc/volumes" Feb 20 10:07:31 crc kubenswrapper[5094]: W0220 10:07:31.942657 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3c5186_6d33_4799_a3b9_22b0645e5a68.slice/crio-d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0 WatchSource:0}: Error finding container d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0: Status 404 returned error can't find the container with id d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0 Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.072810 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.076080 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.096787 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.213160 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" event={"ID":"cb3c5186-6d33-4799-a3b9-22b0645e5a68","Type":"ContainerStarted","Data":"d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0"} Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.213644 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.213943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhn6r\" (UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.214017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.315979 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhn6r\" 
(UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.316366 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.316425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.316936 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.317030 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.336391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhn6r\" (UniqueName: 
\"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.383741 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:32 crc kubenswrapper[5094]: W0220 10:07:32.384028 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ceb59b7_8efc_478c_9663_ec454276c901.slice/crio-3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a WatchSource:0}: Error finding container 3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a: Status 404 returned error can't find the container with id 3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.435538 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.070436 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:33 crc kubenswrapper[5094]: W0220 10:07:33.073070 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43789bbd_d60e_4c83_96d4_83c2345aee73.slice/crio-750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394 WatchSource:0}: Error finding container 750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394: Status 404 returned error can't find the container with id 750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.225779 5094 generic.go:334] "Generic (PLEG): container finished" podID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" exitCode=0 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.226101 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.233658 5094 generic.go:334] "Generic (PLEG): container finished" podID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerID="13e5f052b4ba6e08f0990abbd5ddfbbbb9cd2d06486f28fe2908d667d1f8f224" exitCode=0 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.233753 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" event={"ID":"cb3c5186-6d33-4799-a3b9-22b0645e5a68","Type":"ContainerDied","Data":"13e5f052b4ba6e08f0990abbd5ddfbbbb9cd2d06486f28fe2908d667d1f8f224"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 
10:07:33.235257 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerStarted","Data":"750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.237258 5094 generic.go:334] "Generic (PLEG): container finished" podID="0ceb59b7-8efc-478c-9663-ec454276c901" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" exitCode=0 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.237298 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.237323 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerStarted","Data":"3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.960351 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-9mxxw"] Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.969874 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-9mxxw"] Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.255358 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerStarted","Data":"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1"} Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.258731 5094 generic.go:334] "Generic (PLEG): container finished" podID="43789bbd-d60e-4c83-96d4-83c2345aee73" 
containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" exitCode=0 Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.258814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db"} Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.321383 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gb92s" podStartSLOduration=2.9230023899999997 podStartE2EDuration="5.321361019s" podCreationTimestamp="2026-02-20 10:07:29 +0000 UTC" firstStartedPulling="2026-02-20 10:07:31.204421847 +0000 UTC m=+12066.077048588" lastFinishedPulling="2026-02-20 10:07:33.602780506 +0000 UTC m=+12068.475407217" observedRunningTime="2026-02-20 10:07:34.289737397 +0000 UTC m=+12069.162364108" watchObservedRunningTime="2026-02-20 10:07:34.321361019 +0000 UTC m=+12069.193987730" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.380228 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.561384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.561448 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.561626 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host" (OuterVolumeSpecName: "host") pod "cb3c5186-6d33-4799-a3b9-22b0645e5a68" (UID: "cb3c5186-6d33-4799-a3b9-22b0645e5a68"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.562326 5094 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.568251 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh" (OuterVolumeSpecName: "kube-api-access-j74wh") pod "cb3c5186-6d33-4799-a3b9-22b0645e5a68" (UID: "cb3c5186-6d33-4799-a3b9-22b0645e5a68"). InnerVolumeSpecName "kube-api-access-j74wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.664815 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.161491 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-m44jr"] Feb 20 10:07:35 crc kubenswrapper[5094]: E0220 10:07:35.162414 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerName="container-00" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.162438 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerName="container-00" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.162757 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerName="container-00" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.163680 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.269231 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.269244 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.271287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerStarted","Data":"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93"} Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.276672 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerStarted","Data":"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c"} Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.278120 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.278307 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.380070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"crc-debug-m44jr\" (UID: 
\"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.380200 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.380288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.409328 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.481544 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.858009 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" path="/var/lib/kubelet/pods/cb3c5186-6d33-4799-a3b9-22b0645e5a68/volumes" Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.296402 5094 generic.go:334] "Generic (PLEG): container finished" podID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerID="6915dff992e831dd1242868cc7a6d60444157071af3551c8f9494ba9a8bfdb74" exitCode=0 Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.297837 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" event={"ID":"3479144c-1430-4bb4-8013-8c8fa47e3c75","Type":"ContainerDied","Data":"6915dff992e831dd1242868cc7a6d60444157071af3551c8f9494ba9a8bfdb74"} Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.297937 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" event={"ID":"3479144c-1430-4bb4-8013-8c8fa47e3c75","Type":"ContainerStarted","Data":"84e21a0c9dbd3d1ca3086f37240de3faafc225e08e1d6f708168e851659f8d28"} Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.352690 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-m44jr"] Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.364027 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-m44jr"] Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.416565 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.530664 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"3479144c-1430-4bb4-8013-8c8fa47e3c75\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.530777 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host" (OuterVolumeSpecName: "host") pod "3479144c-1430-4bb4-8013-8c8fa47e3c75" (UID: "3479144c-1430-4bb4-8013-8c8fa47e3c75"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.530827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"3479144c-1430-4bb4-8013-8c8fa47e3c75\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.531306 5094 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.539012 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m" (OuterVolumeSpecName: "kube-api-access-vhm9m") pod "3479144c-1430-4bb4-8013-8c8fa47e3c75" (UID: "3479144c-1430-4bb4-8013-8c8fa47e3c75"). InnerVolumeSpecName "kube-api-access-vhm9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.633548 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.854571 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" path="/var/lib/kubelet/pods/3479144c-1430-4bb4-8013-8c8fa47e3c75/volumes" Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.322643 5094 scope.go:117] "RemoveContainer" containerID="6915dff992e831dd1242868cc7a6d60444157071af3551c8f9494ba9a8bfdb74" Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.322658 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.327035 5094 generic.go:334] "Generic (PLEG): container finished" podID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" exitCode=0 Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.327091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93"} Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.841484 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:38 crc kubenswrapper[5094]: E0220 10:07:38.842379 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.337975 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerStarted","Data":"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27"} Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.365302 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9xvzd" podStartSLOduration=2.850274258 podStartE2EDuration="7.365281592s" podCreationTimestamp="2026-02-20 10:07:32 +0000 UTC" firstStartedPulling="2026-02-20 10:07:34.264425826 +0000 UTC m=+12069.137052537" lastFinishedPulling="2026-02-20 10:07:38.77943312 +0000 UTC m=+12073.652059871" observedRunningTime="2026-02-20 10:07:39.356657314 +0000 UTC m=+12074.229284045" watchObservedRunningTime="2026-02-20 10:07:39.365281592 +0000 UTC m=+12074.237908303" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.443650 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.443743 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.608981 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:40 crc kubenswrapper[5094]: I0220 10:07:40.351679 5094 generic.go:334] "Generic (PLEG): container finished" podID="0ceb59b7-8efc-478c-9663-ec454276c901" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" 
exitCode=0 Feb 20 10:07:40 crc kubenswrapper[5094]: I0220 10:07:40.352968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c"} Feb 20 10:07:40 crc kubenswrapper[5094]: I0220 10:07:40.417419 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.365785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerStarted","Data":"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8"} Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.383436 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kbj5v" podStartSLOduration=2.8882224020000002 podStartE2EDuration="10.38341327s" podCreationTimestamp="2026-02-20 10:07:31 +0000 UTC" firstStartedPulling="2026-02-20 10:07:33.239059583 +0000 UTC m=+12068.111686294" lastFinishedPulling="2026-02-20 10:07:40.734250451 +0000 UTC m=+12075.606877162" observedRunningTime="2026-02-20 10:07:41.382072388 +0000 UTC m=+12076.254699109" watchObservedRunningTime="2026-02-20 10:07:41.38341327 +0000 UTC m=+12076.256039981" Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.861161 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.861204 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.861229 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:42 crc kubenswrapper[5094]: I0220 10:07:42.437251 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:42 crc kubenswrapper[5094]: I0220 10:07:42.437317 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:42 crc kubenswrapper[5094]: I0220 10:07:42.910872 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kbj5v" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" probeResult="failure" output=< Feb 20 10:07:42 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 10:07:42 crc kubenswrapper[5094]: > Feb 20 10:07:43 crc kubenswrapper[5094]: I0220 10:07:43.387644 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gb92s" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server" containerID="cri-o://9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" gracePeriod=2 Feb 20 10:07:43 crc kubenswrapper[5094]: I0220 10:07:43.493238 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9xvzd" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" probeResult="failure" output=< Feb 20 10:07:43 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 10:07:43 crc kubenswrapper[5094]: > Feb 20 10:07:43 crc kubenswrapper[5094]: I0220 10:07:43.961935 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.087954 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"324e88a2-c843-406e-a1c1-3bffb0b5a812\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.088088 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"324e88a2-c843-406e-a1c1-3bffb0b5a812\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.088255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"324e88a2-c843-406e-a1c1-3bffb0b5a812\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.088900 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities" (OuterVolumeSpecName: "utilities") pod "324e88a2-c843-406e-a1c1-3bffb0b5a812" (UID: "324e88a2-c843-406e-a1c1-3bffb0b5a812"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.094853 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9" (OuterVolumeSpecName: "kube-api-access-99qk9") pod "324e88a2-c843-406e-a1c1-3bffb0b5a812" (UID: "324e88a2-c843-406e-a1c1-3bffb0b5a812"). InnerVolumeSpecName "kube-api-access-99qk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.107330 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "324e88a2-c843-406e-a1c1-3bffb0b5a812" (UID: "324e88a2-c843-406e-a1c1-3bffb0b5a812"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.190776 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.190806 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.190817 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.401780 5094 generic.go:334] "Generic (PLEG): container finished" podID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" exitCode=0 Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.401876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1"} Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.402024 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.402145 5094 scope.go:117] "RemoveContainer" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.402076 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"b93f0081cd0b061954685743624431c955a43dc58ffb6a377a892f811f13381d"} Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.443338 5094 scope.go:117] "RemoveContainer" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.468976 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.479064 5094 scope.go:117] "RemoveContainer" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.483928 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.516768 5094 scope.go:117] "RemoveContainer" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" Feb 20 10:07:44 crc kubenswrapper[5094]: E0220 10:07:44.517187 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1\": container with ID starting with 9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1 not found: ID does not exist" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517235 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1"} err="failed to get container status \"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1\": rpc error: code = NotFound desc = could not find container \"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1\": container with ID starting with 9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1 not found: ID does not exist" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517267 5094 scope.go:117] "RemoveContainer" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" Feb 20 10:07:44 crc kubenswrapper[5094]: E0220 10:07:44.517603 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6\": container with ID starting with aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6 not found: ID does not exist" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517624 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6"} err="failed to get container status \"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6\": rpc error: code = NotFound desc = could not find container \"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6\": container with ID starting with aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6 not found: ID does not exist" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517641 5094 scope.go:117] "RemoveContainer" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" Feb 20 10:07:44 crc kubenswrapper[5094]: E0220 
10:07:44.517837 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0\": container with ID starting with 4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0 not found: ID does not exist" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517858 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0"} err="failed to get container status \"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0\": rpc error: code = NotFound desc = could not find container \"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0\": container with ID starting with 4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0 not found: ID does not exist" Feb 20 10:07:45 crc kubenswrapper[5094]: I0220 10:07:45.853015 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" path="/var/lib/kubelet/pods/324e88a2-c843-406e-a1c1-3bffb0b5a812/volumes" Feb 20 10:07:49 crc kubenswrapper[5094]: I0220 10:07:49.840650 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:49 crc kubenswrapper[5094]: E0220 10:07:49.841374 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:51 crc kubenswrapper[5094]: I0220 10:07:51.913050 
5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:51 crc kubenswrapper[5094]: I0220 10:07:51.967478 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:52 crc kubenswrapper[5094]: I0220 10:07:52.150820 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:52 crc kubenswrapper[5094]: I0220 10:07:52.496586 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:52 crc kubenswrapper[5094]: I0220 10:07:52.568898 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:53 crc kubenswrapper[5094]: I0220 10:07:53.493364 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kbj5v" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" containerID="cri-o://7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" gracePeriod=2 Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.061170 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.122963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"0ceb59b7-8efc-478c-9663-ec454276c901\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.123070 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"0ceb59b7-8efc-478c-9663-ec454276c901\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.123090 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"0ceb59b7-8efc-478c-9663-ec454276c901\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.124278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities" (OuterVolumeSpecName: "utilities") pod "0ceb59b7-8efc-478c-9663-ec454276c901" (UID: "0ceb59b7-8efc-478c-9663-ec454276c901"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.133924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9" (OuterVolumeSpecName: "kube-api-access-c78r9") pod "0ceb59b7-8efc-478c-9663-ec454276c901" (UID: "0ceb59b7-8efc-478c-9663-ec454276c901"). InnerVolumeSpecName "kube-api-access-c78r9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.225021 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.225238 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.285948 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ceb59b7-8efc-478c-9663-ec454276c901" (UID: "0ceb59b7-8efc-478c-9663-ec454276c901"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.327371 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507115 5094 generic.go:334] "Generic (PLEG): container finished" podID="0ceb59b7-8efc-478c-9663-ec454276c901" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" exitCode=0 Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8"} Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a"} Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507462 5094 scope.go:117] "RemoveContainer" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507204 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.532075 5094 scope.go:117] "RemoveContainer" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.541455 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.558971 5094 scope.go:117] "RemoveContainer" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.566943 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.596700 5094 scope.go:117] "RemoveContainer" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" Feb 20 10:07:54 crc kubenswrapper[5094]: E0220 10:07:54.597186 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8\": container with ID starting with 7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8 not found: ID does not exist" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597213 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8"} err="failed to get container status \"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8\": rpc error: code = NotFound desc = could not find container \"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8\": container with ID starting with 7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8 not found: ID does not exist" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597234 5094 scope.go:117] "RemoveContainer" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" Feb 20 10:07:54 crc kubenswrapper[5094]: E0220 10:07:54.597584 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c\": container with ID starting with 1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c not found: ID does not exist" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597650 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c"} err="failed to get container status \"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c\": rpc error: code = NotFound desc = could not find container \"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c\": container with ID starting with 1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c not found: ID does not exist" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597676 5094 scope.go:117] "RemoveContainer" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" Feb 20 10:07:54 crc kubenswrapper[5094]: E0220 
10:07:54.598312 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479\": container with ID starting with 1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479 not found: ID does not exist" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.598337 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479"} err="failed to get container status \"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479\": rpc error: code = NotFound desc = could not find container \"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479\": container with ID starting with 1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479 not found: ID does not exist" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.748235 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.748689 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9xvzd" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" containerID="cri-o://1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" gracePeriod=2 Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.245879 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.347357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"43789bbd-d60e-4c83-96d4-83c2345aee73\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.347422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhn6r\" (UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"43789bbd-d60e-4c83-96d4-83c2345aee73\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.347491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"43789bbd-d60e-4c83-96d4-83c2345aee73\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.350183 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities" (OuterVolumeSpecName: "utilities") pod "43789bbd-d60e-4c83-96d4-83c2345aee73" (UID: "43789bbd-d60e-4c83-96d4-83c2345aee73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.359998 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r" (OuterVolumeSpecName: "kube-api-access-rhn6r") pod "43789bbd-d60e-4c83-96d4-83c2345aee73" (UID: "43789bbd-d60e-4c83-96d4-83c2345aee73"). InnerVolumeSpecName "kube-api-access-rhn6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.402522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43789bbd-d60e-4c83-96d4-83c2345aee73" (UID: "43789bbd-d60e-4c83-96d4-83c2345aee73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.450159 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.450623 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhn6r\" (UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.450688 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520455 5094 generic.go:334] "Generic (PLEG): container finished" podID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" exitCode=0 Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520508 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27"} Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520566 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394"} Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520584 5094 scope.go:117] "RemoveContainer" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.559346 5094 scope.go:117] "RemoveContainer" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.574256 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.589123 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.593057 5094 scope.go:117] "RemoveContainer" containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.623308 5094 scope.go:117] "RemoveContainer" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" Feb 20 10:07:55 crc kubenswrapper[5094]: E0220 10:07:55.623838 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27\": container with ID starting with 1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27 not found: ID does not exist" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.623974 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27"} err="failed to get container status \"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27\": rpc error: code = NotFound desc = could not find container \"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27\": container with ID starting with 1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27 not found: ID does not exist" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.624089 5094 scope.go:117] "RemoveContainer" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" Feb 20 10:07:55 crc kubenswrapper[5094]: E0220 10:07:55.624563 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93\": container with ID starting with fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93 not found: ID does not exist" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.624670 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93"} err="failed to get container status \"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93\": rpc error: code = NotFound desc = could not find container \"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93\": container with ID 
starting with fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93 not found: ID does not exist" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.624797 5094 scope.go:117] "RemoveContainer" containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" Feb 20 10:07:55 crc kubenswrapper[5094]: E0220 10:07:55.625221 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db\": container with ID starting with f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db not found: ID does not exist" containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.625333 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db"} err="failed to get container status \"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db\": rpc error: code = NotFound desc = could not find container \"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db\": container with ID starting with f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db not found: ID does not exist" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.874828 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" path="/var/lib/kubelet/pods/0ceb59b7-8efc-478c-9663-ec454276c901/volumes" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.880128 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" path="/var/lib/kubelet/pods/43789bbd-d60e-4c83-96d4-83c2345aee73/volumes" Feb 20 10:08:00 crc kubenswrapper[5094]: I0220 10:08:00.840095 5094 scope.go:117] "RemoveContainer" 
containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:00 crc kubenswrapper[5094]: E0220 10:08:00.840674 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:14 crc kubenswrapper[5094]: I0220 10:08:14.840469 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:14 crc kubenswrapper[5094]: E0220 10:08:14.841296 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:26 crc kubenswrapper[5094]: I0220 10:08:26.840464 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:26 crc kubenswrapper[5094]: E0220 10:08:26.841383 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:37 crc kubenswrapper[5094]: I0220 10:08:37.840555 5094 scope.go:117] 
"RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:37 crc kubenswrapper[5094]: E0220 10:08:37.842733 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:48 crc kubenswrapper[5094]: I0220 10:08:48.841164 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:48 crc kubenswrapper[5094]: E0220 10:08:48.841931 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:09:02 crc kubenswrapper[5094]: I0220 10:09:02.840591 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:09:02 crc kubenswrapper[5094]: E0220 10:09:02.841389 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:09:16 crc kubenswrapper[5094]: I0220 10:09:16.841865 
5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:09:17 crc kubenswrapper[5094]: I0220 10:09:17.939017 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07"} Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.530861 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/init-config-reloader/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.732471 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/alertmanager/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.802568 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/init-config-reloader/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.810380 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/config-reloader/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.942234 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-api/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.006838 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-listener/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.016628 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-evaluator/0.log" Feb 20 10:11:05 crc 
kubenswrapper[5094]: I0220 10:11:05.021454 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-notifier/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.235226 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fdd7dd98-bm4fc_3e777e53-5dbe-4779-bc99-90bbf12cea8f/barbican-api-log/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.237112 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fdd7dd98-bm4fc_3e777e53-5dbe-4779-bc99-90bbf12cea8f/barbican-api/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.687900 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f6984ff88-5xqtx_b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a/barbican-keystone-listener/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.743400 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b95bd745-mrk5m_13560fbf-48aa-45ac-8c10-067377d1adfa/barbican-worker/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.917129 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b95bd745-mrk5m_13560fbf-48aa-45ac-8c10-067377d1adfa/barbican-worker-log/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.073639 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-m7mmg_30a55d13-2efe-4d90-bcef-14aedc741079/bootstrap-openstack-openstack-cell1/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.286414 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-networker-452ln_e3baf01f-744b-44ed-b3c8-2ec288f77e59/bootstrap-openstack-openstack-networker/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.347964 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-7f6984ff88-5xqtx_b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a/barbican-keystone-listener-log/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.426205 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/ceilometer-central-agent/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.511783 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/ceilometer-notification-agent/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.574501 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/proxy-httpd/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.600944 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/sg-core/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.709290 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-ffl97_e01778d5-c4a7-44c6-a9e9-cf7d3cb299db/ceph-client-openstack-openstack-cell1/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.037215 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b8551a6-6aac-4c12-b3ce-913397a5316f/cinder-api-log/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.125881 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b8551a6-6aac-4c12-b3ce-913397a5316f/cinder-api/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.296153 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d7f13f97-3504-4faa-a8cf-8ad4a7973623/probe/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.436529 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_88808044-5011-40de-9088-154284495e1a/cinder-scheduler/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.570857 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_88808044-5011-40de-9088-154284495e1a/probe/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.874352 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_571a6098-6e30-438f-a6a9-fb751a79ca27/probe/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.129839 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-ggj5f_ea277d62-feb7-40a2-80a9-ad1a9d82cb13/configure-network-openstack-openstack-cell1/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.357072 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-networker-59p5g_de84413a-d424-4ec1-bb6d-e91b2278b854/configure-network-openstack-openstack-networker/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.477538 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d7f13f97-3504-4faa-a8cf-8ad4a7973623/cinder-backup/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.628129 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-x6wr7_dd90b879-7bfd-480f-b25e-b7aef96a4b08/configure-os-openstack-openstack-cell1/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.766453 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-networker-ts6jc_9278a86a-be7e-4e04-a187-52d0c119ccb5/configure-os-openstack-openstack-networker/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.873662 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-8479b9d65f-4mzqh_bee88947-a5ae-4438-9283-a3fc34fde9e4/init/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.112804 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8479b9d65f-4mzqh_bee88947-a5ae-4438-9283-a3fc34fde9e4/init/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.168882 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-4zfxb_ee932d3e-c52d-491d-92d8-8e21f7e1adbb/download-cache-openstack-openstack-cell1/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.313777 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8479b9d65f-4mzqh_bee88947-a5ae-4438-9283-a3fc34fde9e4/dnsmasq-dns/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.358845 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-networker-s48qg_74755119-ad5b-439b-80bc-57779ffb5161/download-cache-openstack-openstack-networker/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.543225 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f67b4c32-25f3-4bc0-af69-ff9a9aa04404/glance-log/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.575426 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f67b4c32-25f3-4bc0-af69-ff9a9aa04404/glance-httpd/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.673873 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_571a6098-6e30-438f-a6a9-fb751a79ca27/cinder-volume/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.421447 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9c121ca-4074-4775-a8e5-0c7f8a00ce22/glance-log/0.log" Feb 20 10:11:10 crc 
kubenswrapper[5094]: I0220 10:11:10.482189 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-68d6fbc7c5-czl7r_f4697fe9-ee95-4003-81d9-c6d7935b46cd/heat-engine/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.518109 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9c121ca-4074-4775-a8e5-0c7f8a00ce22/glance-httpd/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.614159 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-76899f657-g7f8m_891348e7-69c8-46e3-a5c2-86c001574a89/heat-cfnapi/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.733688 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6cc7f55d5c-lvdts_128b27b4-464a-4392-af17-51d79bdd1e1e/heat-api/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.792977 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85f686b8b5-kz5d4_dd051d85-41b3-420b-9999-5c9dee9aafe3/horizon/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.838245 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85f686b8b5-kz5d4_dd051d85-41b3-420b-9999-5c9dee9aafe3/horizon-log/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.981853 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-8nqcz_8f8dd6dc-a03c-4873-8ecb-e23bc464edff/install-certs-openstack-openstack-cell1/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.004029 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-networker-xdx54_21d27f85-64a1-4dc5-af39-89275cce2427/install-certs-openstack-openstack-networker/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.097052 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-8j6nf_27e4bec3-7ef3-4f1d-897d-99909f817f5e/install-os-openstack-openstack-cell1/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.218371 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-networker-tqhtr_e890bf4c-20ec-4b45-936d-d08d3a73b5ee/install-os-openstack-openstack-networker/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.498035 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29526301-wg5rm_ba68c6b8-04f9-4515-8e85-3e7b4ca9615b/keystone-cron/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.544132 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29526361-srdg8_5a279e74-ee64-4ff9-8a0f-2700c30a770d/keystone-cron/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.697064 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_640e24e6-f89c-45ee-999a-e5aa0816aab2/kube-state-metrics/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.862820 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-f6rmf_a552adeb-5834-4cfe-8ee3-56472dda5cab/libvirt-openstack-openstack-cell1/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.188942 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f444df446-vdhbp_167ab003-3908-4714-95b2-bfad7c1e1e00/keystone-api/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.299457 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_29ad14f6-de76-4992-b46a-29f0822654c7/probe/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.314786 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_29ad14f6-de76-4992-b46a-29f0822654c7/manila-scheduler/0.log" Feb 20 
10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.385498 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_fc6c3c7a-374b-49fc-98d5-852785c56ee7/manila-api/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.446644 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_fc6c3c7a-374b-49fc-98d5-852785c56ee7/manila-api-log/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.493417 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b/probe/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.521718 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b/manila-share/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.936850 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-g7hl9_f58790dc-4468-40ad-ba58-bb433a926abe/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.046746 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-585ff4fdf7-llqts_78fff8ae-90d4-490d-b302-45fce0bd0101/neutron-httpd/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.149249 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-d2d4n_bf84fab1-aae6-4c92-982e-a4c5b1c7cefe/neutron-metadata-openstack-openstack-cell1/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.411975 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-585ff4fdf7-llqts_78fff8ae-90d4-490d-b302-45fce0bd0101/neutron-api/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.575958 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-networker-f5lls_b741c1f4-f408-486b-bd44-3ae1fcadc83b/neutron-metadata-openstack-openstack-networker/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.732779 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-6sqzq_6d3bf727-1eae-408c-be3d-2df97b387704/neutron-sriov-openstack-openstack-cell1/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.987303 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b6886613-4f07-498a-911f-4d77704ab4df/nova-api-api/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.168517 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9ae582b9-3951-4670-91bf-5d044269ff1c/nova-cell0-conductor-conductor/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.313153 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b6886613-4f07-498a-911f-4d77704ab4df/nova-api-log/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.319147 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_23153570-19e2-4a29-9533-5db90a0c5d09/nova-cell1-conductor-conductor/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.524429 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb/nova-cell1-novncproxy-novncproxy/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.581811 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd_5b30b185-0b70-4ad8-8eca-a292b76fb410/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.763944 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-tclm2_086935dd-74d5-4657-a6a1-25bd11f6455f/nova-cell1-openstack-openstack-cell1/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.985960 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_87f99e6f-46a8-4a46-bcae-81947aa95700/nova-metadata-log/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.008759 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_87f99e6f-46a8-4a46-bcae-81947aa95700/nova-metadata-metadata/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.145220 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_67a3bd12-be26-46a3-bd66-982bea39049a/nova-scheduler-scheduler/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.195731 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98dd23d5-7a26-4a06-a35a-e818b8feba3c/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.381450 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98dd23d5-7a26-4a06-a35a-e818b8feba3c/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.400493 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98dd23d5-7a26-4a06-a35a-e818b8feba3c/galera/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.508942 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_542d99bc-6049-42dc-9036-8a795552e896/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.632377 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_542d99bc-6049-42dc-9036-8a795552e896/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.659000 5094 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_openstack-galera-0_542d99bc-6049-42dc-9036-8a795552e896/galera/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.777314 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3c21f8d0-ca22-4206-9cdf-26edee70eac2/openstackclient/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.834939 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4bd6bf8e-8e67-4de1-a294-6b5d50f1797a/openstack-network-exporter/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.960745 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4bd6bf8e-8e67-4de1-a294-6b5d50f1797a/ovn-northd/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.176052 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-pswsx_3f35b6d1-3070-44cf-bdf8-6376b2434586/ovn-openstack-openstack-cell1/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.356386 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_464f9fe3-85bf-4e78-adc3-3feedbaf1dac/openstack-network-exporter/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.372855 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-networker-r6vzk_3ef4f2ef-92a7-4d12-94a9-e3ee55412547/ovn-openstack-openstack-networker/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.431537 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_464f9fe3-85bf-4e78-adc3-3feedbaf1dac/ovsdbserver-nb/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.565693 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf/openstack-network-exporter/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.639094 5094 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf/ovsdbserver-nb/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.813811 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_79dfde5a-85a9-437f-979d-1fdb99a1bb5f/openstack-network-exporter/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.876180 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_79dfde5a-85a9-437f-979d-1fdb99a1bb5f/ovsdbserver-nb/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.999280 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6ff74bc5-95bf-47fd-969e-cecbf1317e5d/openstack-network-exporter/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.041336 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6ff74bc5-95bf-47fd-969e-cecbf1317e5d/ovsdbserver-sb/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.224030 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_daa57141-e76b-43a8-b363-2a1c7129d7c2/openstack-network-exporter/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.277241 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_daa57141-e76b-43a8-b363-2a1c7129d7c2/ovsdbserver-sb/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.380951 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9e6d0be3-167e-49e9-8450-a563f9115817/openstack-network-exporter/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.430381 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9e6d0be3-167e-49e9-8450-a563f9115817/ovsdbserver-sb/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.833473 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-64d8d4f69d-shjqs_cc482d5b-0b27-4293-b02b-7b02007cf790/placement-api/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.915794 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64d8d4f69d-shjqs_cc482d5b-0b27-4293-b02b-7b02007cf790/placement-log/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.081273 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m_aee17d13-1b0d-49a2-a515-cc63a2f62c63/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.122216 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b_750a5132-7613-40c0-a360-2f1a589d2554/pre-adoption-validation-openstack-pre-adoption-openstack-networ/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.258445 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/init-config-reloader/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.463490 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/init-config-reloader/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.491777 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/config-reloader/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.531182 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/prometheus/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.539719 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/thanos-sidecar/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.703633 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_392a6bbf-c80d-4142-adb2-b4828517b1c6/setup-container/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.904943 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_392a6bbf-c80d-4142-adb2-b4828517b1c6/setup-container/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.960340 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_392a6bbf-c80d-4142-adb2-b4828517b1c6/rabbitmq/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.033563 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b/setup-container/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.231032 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b/setup-container/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.292626 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-gpz7d_134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e/reboot-os-openstack-openstack-cell1/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.295776 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b/rabbitmq/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.540410 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-networker-x57s5_5e673130-22d3-4300-a143-c2821deb8cac/reboot-os-openstack-openstack-networker/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.548006 
5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-fwcg9_73a5aa76-8e8f-4235-bd0d-294f718698fa/run-os-openstack-openstack-cell1/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.718814 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-networker-f78nb_95f054cd-db3e-45e0-9e12-55c2da3b5a23/run-os-openstack-openstack-networker/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.789255 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-zb287_387ade3f-0ebb-4488-8a04-389a018fc31d/ssh-known-hosts-openstack/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.070045 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-s5mln_a134a8f4-8450-4f7c-9988-11686cbdcd19/telemetry-openstack-openstack-cell1/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.207573 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_8e2aa894-2a09-4fad-bcc7-1f259ca48ac9/tempest-tests-tempest-tests-runner/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.278134 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6798c144-2ada-4a54-98c4-72db0e7bd732/test-operator-logs-container/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.460366 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6_110791b2-a067-409d-9970-9db4868f0d4d/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.543162 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f_7894eb94-d4dd-4035-af5b-5994b4ae6d2f/tripleo-cleanup-tripleo-cleanup-openstack-networker/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.666689 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-k27rr_61a917df-8faa-482f-9582-0c5737301057/validate-network-openstack-openstack-cell1/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.807719 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-networker-2pr7n_f04426ab-50e7-4345-842b-69bfcc58207c/validate-network-openstack-openstack-networker/0.log" Feb 20 10:11:32 crc kubenswrapper[5094]: I0220 10:11:32.586130 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5074d037-240e-4685-8c3b-3dd7b963beb0/memcached/0.log" Feb 20 10:11:34 crc kubenswrapper[5094]: I0220 10:11:34.106336 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:11:34 crc kubenswrapper[5094]: I0220 10:11:34.106826 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 10:11:46.702155 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/util/0.log" Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 
10:11:46.927658 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/pull/0.log"
Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 10:11:46.933283 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/util/0.log"
Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 10:11:46.959952 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/pull/0.log"
Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.223525 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/extract/0.log"
Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.258258 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/pull/0.log"
Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.273381 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/util/0.log"
Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.680440 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-xjng5_a91a9b82-fc6b-4900-becb-6dc3c100e429/manager/0.log"
Feb 20 10:11:48 crc kubenswrapper[5094]: I0220 10:11:48.175152 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-26vtn_f6c8e20e-ecca-42d4-9e0e-5547ae567d9f/manager/0.log"
Feb 20 10:11:48 crc kubenswrapper[5094]: I0220 10:11:48.306303 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-p689m_36d60210-52d5-4f28-ae0b-28cce632d5cb/manager/0.log"
Feb 20 10:11:48 crc kubenswrapper[5094]: I0220 10:11:48.536672 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-mnd7v_292cb132-b03c-4d20-8bee-c90ad3c4486b/manager/0.log"
Feb 20 10:11:49 crc kubenswrapper[5094]: I0220 10:11:49.042081 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-8j8pv_fcf15128-56ef-42dc-b230-1cd8b7638d33/manager/0.log"
Feb 20 10:11:49 crc kubenswrapper[5094]: I0220 10:11:49.636098 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-nxtc7_eb67a9bc-35a6-4ce3-bca8-a08ee824cda7/manager/0.log"
Feb 20 10:11:49 crc kubenswrapper[5094]: I0220 10:11:49.736190 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-86bxl_8a1c02cd-3546-45fa-b7db-5903c80681a4/manager/0.log"
Feb 20 10:11:50 crc kubenswrapper[5094]: I0220 10:11:50.003478 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-hfkff_b863d4f9-063a-4102-8c3d-f7e092e4e2c0/manager/0.log"
Feb 20 10:11:50 crc kubenswrapper[5094]: I0220 10:11:50.326883 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-n5dgn_6177108e-bc02-497c-80ab-312f61fbd1c2/manager/0.log"
Feb 20 10:11:50 crc kubenswrapper[5094]: I0220 10:11:50.686040 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-bd9tr_eba1b8e0-b529-47ad-a657-75ce01bad56a/manager/0.log"
Feb 20 10:11:51 crc kubenswrapper[5094]: I0220 10:11:51.390142 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-2ftdz_93dbc041-00c2-4189-abca-6bb3a00abc2d/manager/0.log"
Feb 20 10:11:51 crc kubenswrapper[5094]: I0220 10:11:51.632013 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8_57b4cb2c-e7bc-4430-bfb8-3642dab61d84/manager/0.log"
Feb 20 10:11:52 crc kubenswrapper[5094]: I0220 10:11:52.182146 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-9glnw_234632e4-6191-4ec8-94c5-c93d71c13ad0/operator/0.log"
Feb 20 10:11:52 crc kubenswrapper[5094]: I0220 10:11:52.987303 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-72vrj_be2cc842-778e-4963-80f8-bb5c7426f175/registry-server/0.log"
Feb 20 10:11:53 crc kubenswrapper[5094]: I0220 10:11:53.532003 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-ct2h7_74861845-de37-4091-9226-bcb1bbe64b35/manager/0.log"
Feb 20 10:11:53 crc kubenswrapper[5094]: I0220 10:11:53.588757 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-24cv7_32338b54-c33f-4dc5-b328-9cf4d92d1db6/manager/0.log"
Feb 20 10:11:53 crc kubenswrapper[5094]: I0220 10:11:53.809304 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-9c2k5_c510ecc1-53ce-4611-af6a-09488f9317ed/manager/0.log"
Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.091057 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-fq9n6_f45a4211-8890-4e4a-af96-ccffec62160c/operator/0.log"
Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.306637 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-dh48q_a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5/manager/0.log"
Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.781669 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-cbtkd_c45dcc1f-a95d-4492-9139-16d550809a8e/manager/0.log"
Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.790612 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-nll74_fce6c9b3-2075-479d-9a16-738831a871c4/manager/0.log"
Feb 20 10:11:55 crc kubenswrapper[5094]: I0220 10:11:55.071485 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-lz57q_4413dc36-58b0-447a-ba69-cdd2cee9589c/manager/0.log"
Feb 20 10:11:56 crc kubenswrapper[5094]: I0220 10:11:56.899900 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-57z9v_a1b74404-906b-4466-a3bd-289458ef90ea/manager/0.log"
Feb 20 10:11:57 crc kubenswrapper[5094]: I0220 10:11:57.866548 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-k5dkn_6b09cc76-8cba-42ed-bb2c-fdf4473c9afe/manager/0.log"
Feb 20 10:11:57 crc kubenswrapper[5094]: I0220 10:11:57.914950 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ngkcq_683351ac-f508-4961-b07a-eaac9c26a4f3/manager/0.log"
Feb 20 10:12:04 crc kubenswrapper[5094]: I0220 10:12:04.107612 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:12:04 crc kubenswrapper[5094]: I0220 10:12:04.108448 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:12:17 crc kubenswrapper[5094]: I0220 10:12:17.328246 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-znrdm_38d9642e-3788-4e70-8232-138cd84e02dc/control-plane-machine-set-operator/0.log"
Feb 20 10:12:17 crc kubenswrapper[5094]: I0220 10:12:17.462697 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ps6pv_2f348b60-0d81-490e-bfb4-ea32546c995a/kube-rbac-proxy/0.log"
Feb 20 10:12:17 crc kubenswrapper[5094]: I0220 10:12:17.526321 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ps6pv_2f348b60-0d81-490e-bfb4-ea32546c995a/machine-api-operator/0.log"
Feb 20 10:12:31 crc kubenswrapper[5094]: I0220 10:12:31.054958 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-pdnlx_d6360113-cdd8-48a4-a145-4b54eb5510eb/cert-manager-controller/0.log"
Feb 20 10:12:31 crc kubenswrapper[5094]: I0220 10:12:31.301737 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-mtw89_34f53f0e-6a22-42c9-a953-3ec38e87a70f/cert-manager-cainjector/0.log"
Feb 20 10:12:31 crc kubenswrapper[5094]: I0220 10:12:31.382260 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-sxrw7_bc1f2312-eb97-4f63-b37b-975d9dfb5a73/cert-manager-webhook/0.log"
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.106886 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.107444 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.107524 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.109094 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.109275 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07" gracePeriod=600
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.367041 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07" exitCode=0
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.367120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07"}
Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.367286 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"
Feb 20 10:12:35 crc kubenswrapper[5094]: I0220 10:12:35.379438 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"}
Feb 20 10:12:44 crc kubenswrapper[5094]: I0220 10:12:44.789491 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-74czm_edd001fc-3ddc-4010-8a98-54f4ffeaba72/nmstate-console-plugin/0.log"
Feb 20 10:12:44 crc kubenswrapper[5094]: I0220 10:12:44.974716 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jr284_55b1a421-7ec5-4442-b4c5-11767715cc4b/nmstate-handler/0.log"
Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.028977 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-gvgqm_93238aee-86f0-497a-8880-531338e8245f/kube-rbac-proxy/0.log"
Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.088823 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-gvgqm_93238aee-86f0-497a-8880-531338e8245f/nmstate-metrics/0.log"
Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.188141 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-qg9ms_6804a7c3-a0d7-46d4-b317-e9c54265841e/nmstate-operator/0.log"
Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.267965 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-frvdg_df45fab4-d183-4702-b5b6-2a4e559eff22/nmstate-webhook/0.log"
Feb 20 10:12:46 crc kubenswrapper[5094]: I0220 10:12:46.369050 5094 scope.go:117] "RemoveContainer" containerID="5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f"
Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.068772 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kxcc8_5a9736b1-aca8-4880-9d94-2d7c37efce50/prometheus-operator/0.log"
Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.219616 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-47vjf_b0fb9831-f265-4976-9a1d-14ed3e08daf5/prometheus-operator-admission-webhook/0.log"
Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.313994 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-d7v85_c70d95ea-5321-43fa-8df8-6d1138f0a732/prometheus-operator-admission-webhook/0.log"
Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.425030 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dcm8l_724c1050-e6d7-49c3-8b63-a89a3de26894/operator/0.log"
Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.485148 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9hzgx_1ab531ae-b53c-4de1-b927-ca32c159c244/perses-operator/0.log"
Feb 20 10:13:15 crc kubenswrapper[5094]: I0220 10:13:15.535495 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-s7ndd_2a03b7d3-8e22-4a62-98f0-8d72500fab69/kube-rbac-proxy/0.log"
Feb 20 10:13:15 crc kubenswrapper[5094]: I0220 10:13:15.836837 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-s7ndd_2a03b7d3-8e22-4a62-98f0-8d72500fab69/controller/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.112228 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.238317 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.283728 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.338807 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.349479 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.564079 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.583591 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.615512 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.661206 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.815360 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.821668 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.867557 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/controller/0.log"
Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.899027 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.016000 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/frr-metrics/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.064081 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/kube-rbac-proxy/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.121758 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/kube-rbac-proxy-frr/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.266793 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/reloader/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.393968 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-dljs6_fe469d05-edeb-4d23-b06b-6bdbfc646e99/frr-k8s-webhook-server/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.579629 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d969f468d-fd5gv_059e3724-d657-4f2e-beec-f4f55e09e498/manager/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.708931 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c5fbff78-jk6cf_2dde7604-2a93-4dc0-9b15-b8fe41f79e1e/webhook-server/0.log"
Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.975045 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gjp5f_4d145cb8-0c5c-40f7-a99c-15f1575629c3/kube-rbac-proxy/0.log"
Feb 20 10:13:18 crc kubenswrapper[5094]: I0220 10:13:18.762242 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gjp5f_4d145cb8-0c5c-40f7-a99c-15f1575629c3/speaker/0.log"
Feb 20 10:13:20 crc kubenswrapper[5094]: I0220 10:13:20.277005 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/frr/0.log"
Feb 20 10:13:33 crc kubenswrapper[5094]: I0220 10:13:33.644032 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/util/0.log"
Feb 20 10:13:33 crc kubenswrapper[5094]: I0220 10:13:33.845215 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/util/0.log"
Feb 20 10:13:33 crc kubenswrapper[5094]: I0220 10:13:33.914430 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/pull/0.log"
Feb 20 10:13:33 crc kubenswrapper[5094]: I0220 10:13:33.915538 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/pull/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.123820 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/pull/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.134364 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/util/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.157447 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/extract/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.300730 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/util/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.488880 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/util/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.505529 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/pull/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.520414 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/pull/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.695550 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/util/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.698734 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/pull/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.730795 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/extract/0.log"
Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.892472 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/util/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.068215 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/pull/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.078690 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/util/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.080574 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/pull/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.291272 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/pull/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.340992 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/util/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.342006 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/extract/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.468687 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-utilities/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.666434 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-utilities/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.700417 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-content/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.705937 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-content/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.879588 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-utilities/0.log"
Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.894630 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-content/0.log"
Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.159967 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-utilities/0.log"
Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.245259 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/registry-server/0.log"
Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.371590 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-content/0.log"
Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.376103 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-content/0.log"
Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.379401 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-utilities/0.log"
Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.613075 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-content/0.log"
Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.699046 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-utilities/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.204066 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/util/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.561518 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/util/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.581248 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/registry-server/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.595283 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/pull/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.605422 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/pull/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.796203 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/util/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.800992 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/pull/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.822906 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/extract/0.log"
Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.833608 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-j8j9k_a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9/marketplace-operator/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.005250 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-utilities/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.154321 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-utilities/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.198953 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-content/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.203244 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-content/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.367044 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-utilities/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.441138 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-content/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.464590 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-utilities/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.677786 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-utilities/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.718466 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-content/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.767818 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-content/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.800775 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/registry-server/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.866131 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-utilities/0.log"
Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.890437 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-content/0.log"
Feb 20 10:13:40 crc kubenswrapper[5094]: I0220 10:13:40.149004 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/registry-server/0.log"
Feb 20 10:13:46 crc kubenswrapper[5094]: I0220 10:13:46.419693 5094 scope.go:117] "RemoveContainer" containerID="13e5f052b4ba6e08f0990abbd5ddfbbbb9cd2d06486f28fe2908d667d1f8f224"
Feb 20 10:13:52 crc kubenswrapper[5094]: I0220 10:13:52.939466 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kxcc8_5a9736b1-aca8-4880-9d94-2d7c37efce50/prometheus-operator/0.log"
Feb 20 10:13:52 crc kubenswrapper[5094]: I0220 10:13:52.975501 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-47vjf_b0fb9831-f265-4976-9a1d-14ed3e08daf5/prometheus-operator-admission-webhook/0.log"
Feb 20 10:13:53 crc kubenswrapper[5094]: I0220 10:13:53.057050 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-d7v85_c70d95ea-5321-43fa-8df8-6d1138f0a732/prometheus-operator-admission-webhook/0.log"
Feb 20 10:13:53 crc kubenswrapper[5094]: I0220 10:13:53.227050 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9hzgx_1ab531ae-b53c-4de1-b927-ca32c159c244/perses-operator/0.log"
Feb 20 10:13:53 crc kubenswrapper[5094]: I0220 10:13:53.242872 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dcm8l_724c1050-e6d7-49c3-8b63-a89a3de26894/operator/0.log"
Feb 20 10:14:34 crc kubenswrapper[5094]: I0220 10:14:34.106454 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:14:34 crc kubenswrapper[5094]: I0220 10:14:34.107054 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.176423 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq"]
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178067 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-content"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178089 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-content"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178118 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-content"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178127 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-content"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178146 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerName="container-00"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178155 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerName="container-00"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178171 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-utilities"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178184 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-utilities"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178216 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-utilities"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178225 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-utilities"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178245 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178254 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178281 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-content"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178289 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-content"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178306 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server"
Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178314 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server"
Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178328 5094 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178337 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178354 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178363 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178662 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178685 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178745 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178763 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerName="container-00" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.180120 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.187600 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.187900 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.209805 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq"] Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.357034 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.357575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.357665 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.459650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.459845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.460040 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.462108 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.473718 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.477059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.529413 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: W0220 10:15:00.984195 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc4a9549_d420_4188_83ab_110e9585ad99.slice/crio-afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b WatchSource:0}: Error finding container afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b: Status 404 returned error can't find the container with id afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.984928 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq"] Feb 20 10:15:01 crc kubenswrapper[5094]: I0220 10:15:01.964914 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc4a9549-d420-4188-83ab-110e9585ad99" containerID="cff1e3645b1be65b4ed72cc908b8dac51b9ad71ef96dfbdb704c6d7b943c3aa3" exitCode=0 Feb 20 10:15:01 crc kubenswrapper[5094]: I0220 10:15:01.965399 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" event={"ID":"dc4a9549-d420-4188-83ab-110e9585ad99","Type":"ContainerDied","Data":"cff1e3645b1be65b4ed72cc908b8dac51b9ad71ef96dfbdb704c6d7b943c3aa3"} Feb 20 10:15:01 crc kubenswrapper[5094]: I0220 10:15:01.965751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" event={"ID":"dc4a9549-d420-4188-83ab-110e9585ad99","Type":"ContainerStarted","Data":"afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b"} Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.432250 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.531078 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"dc4a9549-d420-4188-83ab-110e9585ad99\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.531126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"dc4a9549-d420-4188-83ab-110e9585ad99\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.531295 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"dc4a9549-d420-4188-83ab-110e9585ad99\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.532039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc4a9549-d420-4188-83ab-110e9585ad99" (UID: "dc4a9549-d420-4188-83ab-110e9585ad99"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.536999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd" (OuterVolumeSpecName: "kube-api-access-lstjd") pod "dc4a9549-d420-4188-83ab-110e9585ad99" (UID: "dc4a9549-d420-4188-83ab-110e9585ad99"). InnerVolumeSpecName "kube-api-access-lstjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.537042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc4a9549-d420-4188-83ab-110e9585ad99" (UID: "dc4a9549-d420-4188-83ab-110e9585ad99"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.633591 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.633855 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.633865 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.985775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" event={"ID":"dc4a9549-d420-4188-83ab-110e9585ad99","Type":"ContainerDied","Data":"afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b"} Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.985814 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.985859 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.107002 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.107112 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.525552 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.540733 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 10:15:05 crc kubenswrapper[5094]: I0220 10:15:05.863855 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06092367-1969-4b35-8025-09e5a52a5855" path="/var/lib/kubelet/pods/06092367-1969-4b35-8025-09e5a52a5855/volumes" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.107307 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.108017 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.108080 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.109210 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.109315 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" gracePeriod=600 Feb 20 10:15:34 crc kubenswrapper[5094]: E0220 10:15:34.239424 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.389556 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" exitCode=0 Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.389603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"} Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.389636 5094 scope.go:117] "RemoveContainer" containerID="314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.390428 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:15:34 crc kubenswrapper[5094]: E0220 10:15:34.390864 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:15:46 crc kubenswrapper[5094]: I0220 10:15:46.516959 5094 scope.go:117] "RemoveContainer" containerID="4458d3e89efbd0e5ea42a99c4b47f135cba67a66cbdaaf49efb55576b8dd1322" Feb 20 10:15:46 crc kubenswrapper[5094]: I0220 10:15:46.840350 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:15:46 crc kubenswrapper[5094]: E0220 10:15:46.840825 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:16:01 crc kubenswrapper[5094]: I0220 10:16:01.840759 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:16:01 crc kubenswrapper[5094]: E0220 10:16:01.841875 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.918177 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:05 crc kubenswrapper[5094]: E0220 10:16:05.920414 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4a9549-d420-4188-83ab-110e9585ad99" containerName="collect-profiles" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.920527 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4a9549-d420-4188-83ab-110e9585ad99" containerName="collect-profiles" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.920897 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4a9549-d420-4188-83ab-110e9585ad99" containerName="collect-profiles" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.922829 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.934378 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.060249 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.060311 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.060377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.162591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.162964 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.163143 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.163273 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.163396 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.198221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.244067 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.833137 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.837296 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c" exitCode=0 Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.837431 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c"} Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.838041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerStarted","Data":"82aca659b94fe1e7677dac1deb6868541b3c2c79dd885663bc3a1b1e828c3b58"} Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.860460 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:16:08 crc kubenswrapper[5094]: I0220 10:16:08.852952 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerStarted","Data":"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"} Feb 20 10:16:09 crc kubenswrapper[5094]: I0220 10:16:09.865738 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3" exitCode=0 Feb 20 10:16:09 crc kubenswrapper[5094]: I0220 10:16:09.865912 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"}
Feb 20 10:16:10 crc kubenswrapper[5094]: I0220 10:16:10.884346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerStarted","Data":"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"}
Feb 20 10:16:10 crc kubenswrapper[5094]: I0220 10:16:10.917311 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vj9d" podStartSLOduration=3.505244887 podStartE2EDuration="5.91728777s" podCreationTimestamp="2026-02-20 10:16:05 +0000 UTC" firstStartedPulling="2026-02-20 10:16:07.853911063 +0000 UTC m=+12582.726537774" lastFinishedPulling="2026-02-20 10:16:10.265953946 +0000 UTC m=+12585.138580657" observedRunningTime="2026-02-20 10:16:10.908487968 +0000 UTC m=+12585.781114699" watchObservedRunningTime="2026-02-20 10:16:10.91728777 +0000 UTC m=+12585.789914481"
Feb 20 10:16:13 crc kubenswrapper[5094]: I0220 10:16:13.841361 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:16:13 crc kubenswrapper[5094]: E0220 10:16:13.842229 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:16:16 crc kubenswrapper[5094]: I0220 10:16:16.244742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7vj9d"
Feb 20 10:16:16 crc kubenswrapper[5094]: I0220 10:16:16.245166 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7vj9d"
Feb 20 10:16:16 crc kubenswrapper[5094]: I0220 10:16:16.328094 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vj9d"
Feb 20 10:16:17 crc kubenswrapper[5094]: I0220 10:16:17.027963 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vj9d"
Feb 20 10:16:17 crc kubenswrapper[5094]: I0220 10:16:17.119265 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"]
Feb 20 10:16:18 crc kubenswrapper[5094]: I0220 10:16:18.983108 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7vj9d" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server" containerID="cri-o://15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10" gracePeriod=2
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.574634 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d"
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.578224 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") "
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.579246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities" (OuterVolumeSpecName: "utilities") pod "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" (UID: "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.580880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") "
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.581052 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") "
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.581944 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.590691 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh" (OuterVolumeSpecName: "kube-api-access-9rwrh") pod "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" (UID: "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b"). InnerVolumeSpecName "kube-api-access-9rwrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.646382 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" (UID: "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.684424 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") on node \"crc\" DevicePath \"\""
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.684453 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995473 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10" exitCode=0
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995512 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"}
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995554 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d"
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"82aca659b94fe1e7677dac1deb6868541b3c2c79dd885663bc3a1b1e828c3b58"}
Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995614 5094 scope.go:117] "RemoveContainer" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.028370 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"]
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.037874 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"]
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.038442 5094 scope.go:117] "RemoveContainer" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.075748 5094 scope.go:117] "RemoveContainer" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.102205 5094 scope.go:117] "RemoveContainer" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"
Feb 20 10:16:20 crc kubenswrapper[5094]: E0220 10:16:20.102883 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10\": container with ID starting with 15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10 not found: ID does not exist" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.102926 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"} err="failed to get container status \"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10\": rpc error: code = NotFound desc = could not find container \"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10\": container with ID starting with 15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10 not found: ID does not exist"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.102950 5094 scope.go:117] "RemoveContainer" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"
Feb 20 10:16:20 crc kubenswrapper[5094]: E0220 10:16:20.103288 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3\": container with ID starting with 9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3 not found: ID does not exist" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.103316 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"} err="failed to get container status \"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3\": rpc error: code = NotFound desc = could not find container \"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3\": container with ID starting with 9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3 not found: ID does not exist"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.103332 5094 scope.go:117] "RemoveContainer" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c"
Feb 20 crc kubenswrapper[5094]: E0220 10:16:20.103534 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c\": container with ID starting with 0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c not found: ID does not exist" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c"
Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.103563 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c"} err="failed to get container status \"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c\": rpc error: code = NotFound desc = could not find container \"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c\": container with ID starting with 0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c not found: ID does not exist"
Feb 20 10:16:21 crc kubenswrapper[5094]: I0220 10:16:21.870520 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" path="/var/lib/kubelet/pods/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b/volumes"
Feb 20 10:16:26 crc kubenswrapper[5094]: I0220 10:16:26.840476 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:16:26 crc kubenswrapper[5094]: E0220 10:16:26.841540 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:16:38 crc kubenswrapper[5094]: I0220 10:16:38.841303 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:16:38 crc kubenswrapper[5094]: E0220 10:16:38.842547 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:16:51 crc kubenswrapper[5094]: I0220 10:16:51.841377 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:16:51 crc kubenswrapper[5094]: E0220 10:16:51.842273 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:17:02 crc kubenswrapper[5094]: I0220 10:17:02.840449 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:17:02 crc kubenswrapper[5094]: E0220 10:17:02.841285 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:17:13 crc kubenswrapper[5094]: I0220 10:17:13.841337 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:17:13 crc kubenswrapper[5094]: E0220 10:17:13.842358 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:17:27 crc kubenswrapper[5094]: I0220 10:17:27.846951 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:17:27 crc kubenswrapper[5094]: E0220 10:17:27.849743 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:17:32 crc kubenswrapper[5094]: I0220 10:17:32.885075 5094 generic.go:334] "Generic (PLEG): container finished" podID="2578360e-4830-4223-b07f-031c6c2df11e" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899" exitCode=0
Feb 20 10:17:32 crc kubenswrapper[5094]: I0220 10:17:32.885136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerDied","Data":"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"}
Feb 20 10:17:32 crc kubenswrapper[5094]: I0220 10:17:32.886252 5094 scope.go:117] "RemoveContainer" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"
Feb 20 10:17:33 crc kubenswrapper[5094]: I0220 10:17:33.422272 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pc5cx_must-gather-n6jcq_2578360e-4830-4223-b07f-031c6c2df11e/gather/0.log"
Feb 20 10:17:39 crc kubenswrapper[5094]: I0220 10:17:39.847656 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:17:39 crc kubenswrapper[5094]: E0220 10:17:39.849047 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:17:44 crc kubenswrapper[5094]: I0220 10:17:44.943175 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"]
Feb 20 10:17:44 crc kubenswrapper[5094]: I0220 10:17:44.944376 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy" containerID="cri-o://ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147" gracePeriod=2
Feb 20 10:17:44 crc kubenswrapper[5094]: I0220 10:17:44.957568 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"]
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.390095 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pc5cx_must-gather-n6jcq_2578360e-4830-4223-b07f-031c6c2df11e/copy/0.log"
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.390875 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq"
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.468780 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"2578360e-4830-4223-b07f-031c6c2df11e\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") "
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.469556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"2578360e-4830-4223-b07f-031c6c2df11e\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") "
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.477348 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv" (OuterVolumeSpecName: "kube-api-access-c7jgv") pod "2578360e-4830-4223-b07f-031c6c2df11e" (UID: "2578360e-4830-4223-b07f-031c6c2df11e"). InnerVolumeSpecName "kube-api-access-c7jgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.574917 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.815794 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2578360e-4830-4223-b07f-031c6c2df11e" (UID: "2578360e-4830-4223-b07f-031c6c2df11e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.850360 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2578360e-4830-4223-b07f-031c6c2df11e" path="/var/lib/kubelet/pods/2578360e-4830-4223-b07f-031c6c2df11e/volumes"
Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.881021 5094 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040473 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pc5cx_must-gather-n6jcq_2578360e-4830-4223-b07f-031c6c2df11e/copy/0.log"
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040821 5094 generic.go:334] "Generic (PLEG): container finished" podID="2578360e-4830-4223-b07f-031c6c2df11e" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147" exitCode=143
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040864 5094 scope.go:117] "RemoveContainer" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147"
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040992 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq"
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.063997 5094 scope.go:117] "RemoveContainer" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.145565 5094 scope.go:117] "RemoveContainer" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147"
Feb 20 10:17:46 crc kubenswrapper[5094]: E0220 10:17:46.146003 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147\": container with ID starting with ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147 not found: ID does not exist" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147"
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.146239 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147"} err="failed to get container status \"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147\": rpc error: code = NotFound desc = could not find container \"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147\": container with ID starting with ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147 not found: ID does not exist"
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.146443 5094 scope.go:117] "RemoveContainer" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"
Feb 20 10:17:46 crc kubenswrapper[5094]: E0220 10:17:46.146983 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899\": container with ID starting with bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899 not found: ID does not exist" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"
Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.147038 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"} err="failed to get container status \"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899\": rpc error: code = NotFound desc = could not find container \"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899\": container with ID starting with bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899 not found: ID does not exist"
Feb 20 10:17:52 crc kubenswrapper[5094]: I0220 10:17:52.840539 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:17:52 crc kubenswrapper[5094]: E0220 10:17:52.842050 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.564326 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bh6s7"]
Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565697 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565769 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy"
Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565799 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565907 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server"
Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565929 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-utilities"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565941 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-utilities"
Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565957 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="gather"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565965 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="gather"
Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-content"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565996 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-content"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.566292 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="gather"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.566312 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.566325 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.568496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.580514 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"]
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.768063 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.768171 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.768224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870315 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870406 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870516 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870982 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.891591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:57 crc kubenswrapper[5094]: I0220 10:17:57.187057 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:17:57 crc kubenswrapper[5094]: I0220 10:17:57.633749 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"]
Feb 20 10:17:58 crc kubenswrapper[5094]: I0220 10:17:58.174348 5094 generic.go:334] "Generic (PLEG): container finished" podID="75023528-13e3-4ab0-927d-6edfa21f1627" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" exitCode=0
Feb 20 10:17:58 crc kubenswrapper[5094]: I0220 10:17:58.174417 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03"}
Feb 20 10:17:58 crc kubenswrapper[5094]: I0220 10:17:58.174811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerStarted","Data":"3f1c87259bd06d4e00943cc87d16195aae2ed1cfd874fe2cad4388dd9b173469"}
Feb 20 10:17:59 crc kubenswrapper[5094]: I0220 10:17:59.187992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerStarted","Data":"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc"}
Feb 20 10:18:01 crc kubenswrapper[5094]: I0220 10:18:01.209839 5094 generic.go:334] "Generic (PLEG): container finished" podID="75023528-13e3-4ab0-927d-6edfa21f1627" containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" exitCode=0
Feb 20 10:18:01 crc kubenswrapper[5094]: I0220 10:18:01.209939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc"}
Feb 20 10:18:02 crc kubenswrapper[5094]: I0220 10:18:02.220450 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerStarted","Data":"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67"}
Feb 20 10:18:02 crc kubenswrapper[5094]: I0220 10:18:02.251845 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bh6s7" podStartSLOduration=2.819304675 podStartE2EDuration="6.251819088s" podCreationTimestamp="2026-02-20 10:17:56 +0000 UTC" firstStartedPulling="2026-02-20 10:17:58.176925777 +0000 UTC m=+12693.049552498" lastFinishedPulling="2026-02-20 10:18:01.60944021 +0000 UTC m=+12696.482066911" observedRunningTime="2026-02-20 10:18:02.24115547 +0000 UTC m=+12697.113782191" watchObservedRunningTime="2026-02-20 10:18:02.251819088 +0000 UTC m=+12697.124445829"
Feb 20 10:18:03 crc kubenswrapper[5094]: I0220 10:18:03.842483 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"
Feb 20 10:18:03 crc kubenswrapper[5094]: E0220 10:18:03.843084 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.187946 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.188773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.255146 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.357756 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.506016 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"]
Feb 20 10:18:09 crc kubenswrapper[5094]: I0220 10:18:09.331448 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bh6s7" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" containerID="cri-o://a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" gracePeriod=2
Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:09.901186 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7"
Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.352908 5094 generic.go:334] "Generic (PLEG): container finished" podID="75023528-13e3-4ab0-927d-6edfa21f1627" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" exitCode=0
Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.352958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67"}
Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.352990 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"3f1c87259bd06d4e00943cc87d16195aae2ed1cfd874fe2cad4388dd9b173469"}
Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.353014 5094 scope.go:117] "RemoveContainer" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67"
Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.353204 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.367161 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"75023528-13e3-4ab0-927d-6edfa21f1627\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.367206 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"75023528-13e3-4ab0-927d-6edfa21f1627\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.367348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"75023528-13e3-4ab0-927d-6edfa21f1627\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.368995 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities" (OuterVolumeSpecName: "utilities") pod "75023528-13e3-4ab0-927d-6edfa21f1627" (UID: "75023528-13e3-4ab0-927d-6edfa21f1627"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.375902 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc" (OuterVolumeSpecName: "kube-api-access-gzxhc") pod "75023528-13e3-4ab0-927d-6edfa21f1627" (UID: "75023528-13e3-4ab0-927d-6edfa21f1627"). InnerVolumeSpecName "kube-api-access-gzxhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.385857 5094 scope.go:117] "RemoveContainer" containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.431229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75023528-13e3-4ab0-927d-6edfa21f1627" (UID: "75023528-13e3-4ab0-927d-6edfa21f1627"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.434522 5094 scope.go:117] "RemoveContainer" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.470385 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.470429 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.470447 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.477868 5094 scope.go:117] "RemoveContainer" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" Feb 20 10:18:10 crc kubenswrapper[5094]: E0220 10:18:10.478917 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67\": container with ID starting with a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67 not found: ID does not exist" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.478961 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67"} err="failed to get container status \"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67\": rpc error: code = NotFound desc = could not find container \"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67\": container with ID starting with a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67 not found: ID does not exist" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.478989 5094 scope.go:117] "RemoveContainer" containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" Feb 20 10:18:10 crc kubenswrapper[5094]: E0220 10:18:10.479482 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc\": container with ID starting with 836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc not found: ID does not exist" containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.479507 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc"} err="failed to get container status \"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc\": rpc error: code = NotFound desc = could not find container 
\"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc\": container with ID starting with 836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc not found: ID does not exist" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.479523 5094 scope.go:117] "RemoveContainer" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" Feb 20 10:18:10 crc kubenswrapper[5094]: E0220 10:18:10.480029 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03\": container with ID starting with c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03 not found: ID does not exist" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.480054 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03"} err="failed to get container status \"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03\": rpc error: code = NotFound desc = could not find container \"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03\": container with ID starting with c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03 not found: ID does not exist" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.714486 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.731617 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:18:11 crc kubenswrapper[5094]: I0220 10:18:11.856574 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" 
path="/var/lib/kubelet/pods/75023528-13e3-4ab0-927d-6edfa21f1627/volumes" Feb 20 10:18:18 crc kubenswrapper[5094]: I0220 10:18:18.840218 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:18 crc kubenswrapper[5094]: E0220 10:18:18.840788 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.226460 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:30 crc kubenswrapper[5094]: E0220 10:18:30.227855 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-utilities" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.227878 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-utilities" Feb 20 10:18:30 crc kubenswrapper[5094]: E0220 10:18:30.227903 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-content" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.227916 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-content" Feb 20 10:18:30 crc kubenswrapper[5094]: E0220 10:18:30.227945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.227959 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.228324 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.230817 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.270344 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.351127 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.351283 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.351369 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453001 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453849 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.454068 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.490021 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.567832 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.093037 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.625205 5094 generic.go:334] "Generic (PLEG): container finished" podID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" exitCode=0 Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.625316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1"} Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.625459 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerStarted","Data":"792549d34da7adbc68a3fe7ee378715e0ab4d58104bfcdc9915532a19f015019"} Feb 20 10:18:32 crc kubenswrapper[5094]: I0220 10:18:32.638212 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerStarted","Data":"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c"} Feb 20 10:18:32 crc kubenswrapper[5094]: I0220 10:18:32.840682 5094 scope.go:117] "RemoveContainer" 
containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:32 crc kubenswrapper[5094]: E0220 10:18:32.853913 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:35 crc kubenswrapper[5094]: I0220 10:18:35.685516 5094 generic.go:334] "Generic (PLEG): container finished" podID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" exitCode=0 Feb 20 10:18:35 crc kubenswrapper[5094]: I0220 10:18:35.685651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c"} Feb 20 10:18:36 crc kubenswrapper[5094]: I0220 10:18:36.697309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerStarted","Data":"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb"} Feb 20 10:18:36 crc kubenswrapper[5094]: I0220 10:18:36.723117 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-559qt" podStartSLOduration=2.268313618 podStartE2EDuration="6.723099445s" podCreationTimestamp="2026-02-20 10:18:30 +0000 UTC" firstStartedPulling="2026-02-20 10:18:31.627392465 +0000 UTC m=+12726.500019176" lastFinishedPulling="2026-02-20 10:18:36.082178292 +0000 UTC m=+12730.954805003" observedRunningTime="2026-02-20 10:18:36.714555599 +0000 UTC 
m=+12731.587182310" watchObservedRunningTime="2026-02-20 10:18:36.723099445 +0000 UTC m=+12731.595726156" Feb 20 10:18:40 crc kubenswrapper[5094]: I0220 10:18:40.569518 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:40 crc kubenswrapper[5094]: I0220 10:18:40.570117 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:41 crc kubenswrapper[5094]: I0220 10:18:41.630119 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-559qt" podUID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerName="registry-server" probeResult="failure" output=< Feb 20 10:18:41 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 10:18:41 crc kubenswrapper[5094]: > Feb 20 10:18:44 crc kubenswrapper[5094]: I0220 10:18:44.840270 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:44 crc kubenswrapper[5094]: E0220 10:18:44.840979 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.193432 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.198988 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.208281 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.326838 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.326915 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.326956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.429673 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.429829 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.429910 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.430555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.431535 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.476159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.537926 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.066230 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:47 crc kubenswrapper[5094]: E0220 10:18:47.510918 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefa329d_6728_4dbb_aa04_ffe8a237e397.slice/crio-conmon-19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.825833 5094 generic.go:334] "Generic (PLEG): container finished" podID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" exitCode=0 Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.825890 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade"} Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.826065 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerStarted","Data":"ffe9c58e2491aee9243375a1ad768e3748148129ed1b476a7a951f6ce3a1544a"} Feb 20 10:18:48 crc kubenswrapper[5094]: I0220 10:18:48.835954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerStarted","Data":"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c"} Feb 20 10:18:49 crc kubenswrapper[5094]: I0220 10:18:49.855023 5094 generic.go:334] "Generic (PLEG): 
container finished" podID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" exitCode=0 Feb 20 10:18:49 crc kubenswrapper[5094]: I0220 10:18:49.860452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c"} Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.650780 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.711357 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.867430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerStarted","Data":"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29"} Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.896385 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rq6hb" podStartSLOduration=2.25101946 podStartE2EDuration="4.896361592s" podCreationTimestamp="2026-02-20 10:18:46 +0000 UTC" firstStartedPulling="2026-02-20 10:18:47.828585638 +0000 UTC m=+12742.701212389" lastFinishedPulling="2026-02-20 10:18:50.47392777 +0000 UTC m=+12745.346554521" observedRunningTime="2026-02-20 10:18:50.884825203 +0000 UTC m=+12745.757451944" watchObservedRunningTime="2026-02-20 10:18:50.896361592 +0000 UTC m=+12745.768988323" Feb 20 10:18:52 crc kubenswrapper[5094]: I0220 10:18:52.966474 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:52 crc 
kubenswrapper[5094]: I0220 10:18:52.967075 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-559qt" podUID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerName="registry-server" containerID="cri-o://bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" gracePeriod=2 Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.463908 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.524485 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.524539 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.524572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.526210 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities" (OuterVolumeSpecName: "utilities") pod "c2fb1f9f-aa57-4dff-9a98-23600dabc73c" (UID: "c2fb1f9f-aa57-4dff-9a98-23600dabc73c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.531033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld" (OuterVolumeSpecName: "kube-api-access-6dvld") pod "c2fb1f9f-aa57-4dff-9a98-23600dabc73c" (UID: "c2fb1f9f-aa57-4dff-9a98-23600dabc73c"). InnerVolumeSpecName "kube-api-access-6dvld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.626876 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.626943 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.655314 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2fb1f9f-aa57-4dff-9a98-23600dabc73c" (UID: "c2fb1f9f-aa57-4dff-9a98-23600dabc73c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.728508 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902817 5094 generic.go:334] "Generic (PLEG): container finished" podID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" exitCode=0 Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902863 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb"} Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902894 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"792549d34da7adbc68a3fe7ee378715e0ab4d58104bfcdc9915532a19f015019"} Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902915 5094 scope.go:117] "RemoveContainer" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902932 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.933238 5094 scope.go:117] "RemoveContainer" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.933405 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.952087 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.958124 5094 scope.go:117] "RemoveContainer" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.015012 5094 scope.go:117] "RemoveContainer" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" Feb 20 10:18:54 crc kubenswrapper[5094]: E0220 10:18:54.019039 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb\": container with ID starting with bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb not found: ID does not exist" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019076 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb"} err="failed to get container status \"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb\": rpc error: code = NotFound desc = could not find container \"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb\": container with ID starting with bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb not found: ID does 
not exist" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019099 5094 scope.go:117] "RemoveContainer" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" Feb 20 10:18:54 crc kubenswrapper[5094]: E0220 10:18:54.019352 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c\": container with ID starting with 163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c not found: ID does not exist" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019376 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c"} err="failed to get container status \"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c\": rpc error: code = NotFound desc = could not find container \"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c\": container with ID starting with 163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c not found: ID does not exist" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019389 5094 scope.go:117] "RemoveContainer" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" Feb 20 10:18:54 crc kubenswrapper[5094]: E0220 10:18:54.020033 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1\": container with ID starting with 2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1 not found: ID does not exist" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.020103 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1"} err="failed to get container status \"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1\": rpc error: code = NotFound desc = could not find container \"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1\": container with ID starting with 2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1 not found: ID does not exist" Feb 20 10:18:55 crc kubenswrapper[5094]: I0220 10:18:55.857581 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" path="/var/lib/kubelet/pods/c2fb1f9f-aa57-4dff-9a98-23600dabc73c/volumes" Feb 20 10:18:56 crc kubenswrapper[5094]: I0220 10:18:56.538422 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:56 crc kubenswrapper[5094]: I0220 10:18:56.538806 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:56 crc kubenswrapper[5094]: I0220 10:18:56.600201 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:57 crc kubenswrapper[5094]: I0220 10:18:57.009140 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:57 crc kubenswrapper[5094]: I0220 10:18:57.766914 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:58 crc kubenswrapper[5094]: I0220 10:18:58.965397 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rq6hb" podUID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerName="registry-server" 
containerID="cri-o://73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" gracePeriod=2 Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.440688 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.581635 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"defa329d-6728-4dbb-aa04-ffe8a237e397\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.581869 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"defa329d-6728-4dbb-aa04-ffe8a237e397\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.582049 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"defa329d-6728-4dbb-aa04-ffe8a237e397\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.582672 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities" (OuterVolumeSpecName: "utilities") pod "defa329d-6728-4dbb-aa04-ffe8a237e397" (UID: "defa329d-6728-4dbb-aa04-ffe8a237e397"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.587888 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq" (OuterVolumeSpecName: "kube-api-access-5sprq") pod "defa329d-6728-4dbb-aa04-ffe8a237e397" (UID: "defa329d-6728-4dbb-aa04-ffe8a237e397"). InnerVolumeSpecName "kube-api-access-5sprq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.629069 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "defa329d-6728-4dbb-aa04-ffe8a237e397" (UID: "defa329d-6728-4dbb-aa04-ffe8a237e397"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.684484 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.684526 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.684537 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.840369 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:59 crc kubenswrapper[5094]: E0220 
10:18:59.840695 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982766 5094 generic.go:334] "Generic (PLEG): container finished" podID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" exitCode=0 Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982808 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29"} Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982838 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"ffe9c58e2491aee9243375a1ad768e3748148129ed1b476a7a951f6ce3a1544a"} Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982865 5094 scope.go:117] "RemoveContainer" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982874 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.007295 5094 scope.go:117] "RemoveContainer" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.041581 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.043030 5094 scope.go:117] "RemoveContainer" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.053244 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.090626 5094 scope.go:117] "RemoveContainer" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" Feb 20 10:19:00 crc kubenswrapper[5094]: E0220 10:19:00.091321 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29\": container with ID starting with 73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29 not found: ID does not exist" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.091376 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29"} err="failed to get container status \"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29\": rpc error: code = NotFound desc = could not find container \"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29\": container with ID starting with 73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29 not found: 
ID does not exist" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.091409 5094 scope.go:117] "RemoveContainer" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" Feb 20 10:19:00 crc kubenswrapper[5094]: E0220 10:19:00.092002 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c\": container with ID starting with 168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c not found: ID does not exist" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.092121 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c"} err="failed to get container status \"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c\": rpc error: code = NotFound desc = could not find container \"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c\": container with ID starting with 168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c not found: ID does not exist" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.092205 5094 scope.go:117] "RemoveContainer" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" Feb 20 10:19:00 crc kubenswrapper[5094]: E0220 10:19:00.092544 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade\": container with ID starting with 19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade not found: ID does not exist" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.092584 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade"} err="failed to get container status \"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade\": rpc error: code = NotFound desc = could not find container \"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade\": container with ID starting with 19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade not found: ID does not exist" Feb 20 10:19:01 crc kubenswrapper[5094]: I0220 10:19:01.855021 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defa329d-6728-4dbb-aa04-ffe8a237e397" path="/var/lib/kubelet/pods/defa329d-6728-4dbb-aa04-ffe8a237e397/volumes" Feb 20 10:19:11 crc kubenswrapper[5094]: I0220 10:19:11.840092 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:11 crc kubenswrapper[5094]: E0220 10:19:11.841252 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:22 crc kubenswrapper[5094]: I0220 10:19:22.841191 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:22 crc kubenswrapper[5094]: E0220 10:19:22.842427 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:34 crc kubenswrapper[5094]: I0220 10:19:34.840194 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:34 crc kubenswrapper[5094]: E0220 10:19:34.841267 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:47 crc kubenswrapper[5094]: I0220 10:19:47.841661 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:47 crc kubenswrapper[5094]: E0220 10:19:47.843017 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:59 crc kubenswrapper[5094]: I0220 10:19:59.840226 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:59 crc kubenswrapper[5094]: E0220 10:19:59.841037 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:20:10 crc kubenswrapper[5094]: I0220 10:20:10.841209 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:20:10 crc kubenswrapper[5094]: E0220 10:20:10.842758 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:20:21 crc kubenswrapper[5094]: I0220 10:20:21.841664 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:20:21 crc kubenswrapper[5094]: E0220 10:20:21.842973 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:20:35 crc kubenswrapper[5094]: I0220 10:20:35.857601 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:20:36 crc kubenswrapper[5094]: I0220 10:20:36.173777 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"e494f0f6155c48fe39960089a5835d5140b6d9af1d65647b4a7985a7770671bf"}